Oct  7 15:12:44 np0005474864 kernel: Linux version 5.14.0-620.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-11), GNU ld version 2.35.2-67.el9) #1 SMP PREEMPT_DYNAMIC Fri Sep 26 01:13:23 UTC 2025
Oct  7 15:12:44 np0005474864 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Oct  7 15:12:44 np0005474864 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  7 15:12:44 np0005474864 kernel: BIOS-provided physical RAM map:
Oct  7 15:12:44 np0005474864 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Oct  7 15:12:44 np0005474864 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Oct  7 15:12:44 np0005474864 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Oct  7 15:12:44 np0005474864 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Oct  7 15:12:44 np0005474864 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Oct  7 15:12:44 np0005474864 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Oct  7 15:12:44 np0005474864 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Oct  7 15:12:44 np0005474864 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Oct  7 15:12:44 np0005474864 kernel: NX (Execute Disable) protection: active
Oct  7 15:12:44 np0005474864 kernel: APIC: Static calls initialized
Oct  7 15:12:44 np0005474864 kernel: SMBIOS 2.8 present.
Oct  7 15:12:44 np0005474864 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Oct  7 15:12:44 np0005474864 kernel: Hypervisor detected: KVM
Oct  7 15:12:44 np0005474864 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Oct  7 15:12:44 np0005474864 kernel: kvm-clock: using sched offset of 5303742981 cycles
Oct  7 15:12:44 np0005474864 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Oct  7 15:12:44 np0005474864 kernel: tsc: Detected 2800.000 MHz processor
Oct  7 15:12:44 np0005474864 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Oct  7 15:12:44 np0005474864 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Oct  7 15:12:44 np0005474864 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Oct  7 15:12:44 np0005474864 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Oct  7 15:12:44 np0005474864 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Oct  7 15:12:44 np0005474864 kernel: Using GB pages for direct mapping
Oct  7 15:12:44 np0005474864 kernel: RAMDISK: [mem 0x2d7c4000-0x32bd9fff]
Oct  7 15:12:44 np0005474864 kernel: ACPI: Early table checksum verification disabled
Oct  7 15:12:44 np0005474864 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Oct  7 15:12:44 np0005474864 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  7 15:12:44 np0005474864 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  7 15:12:44 np0005474864 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  7 15:12:44 np0005474864 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Oct  7 15:12:44 np0005474864 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  7 15:12:44 np0005474864 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Oct  7 15:12:44 np0005474864 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Oct  7 15:12:44 np0005474864 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Oct  7 15:12:44 np0005474864 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Oct  7 15:12:44 np0005474864 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Oct  7 15:12:44 np0005474864 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Oct  7 15:12:44 np0005474864 kernel: No NUMA configuration found
Oct  7 15:12:44 np0005474864 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Oct  7 15:12:44 np0005474864 kernel: NODE_DATA(0) allocated [mem 0x23ffd3000-0x23fffdfff]
Oct  7 15:12:44 np0005474864 kernel: crashkernel reserved: 0x00000000a2000000 - 0x00000000b2000000 (256 MB)
Oct  7 15:12:44 np0005474864 kernel: Zone ranges:
Oct  7 15:12:44 np0005474864 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Oct  7 15:12:44 np0005474864 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Oct  7 15:12:44 np0005474864 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Oct  7 15:12:44 np0005474864 kernel:  Device   empty
Oct  7 15:12:44 np0005474864 kernel: Movable zone start for each node
Oct  7 15:12:44 np0005474864 kernel: Early memory node ranges
Oct  7 15:12:44 np0005474864 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Oct  7 15:12:44 np0005474864 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Oct  7 15:12:44 np0005474864 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Oct  7 15:12:44 np0005474864 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Oct  7 15:12:44 np0005474864 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Oct  7 15:12:44 np0005474864 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Oct  7 15:12:44 np0005474864 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Oct  7 15:12:44 np0005474864 kernel: ACPI: PM-Timer IO Port: 0x608
Oct  7 15:12:44 np0005474864 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Oct  7 15:12:44 np0005474864 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Oct  7 15:12:44 np0005474864 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Oct  7 15:12:44 np0005474864 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Oct  7 15:12:44 np0005474864 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Oct  7 15:12:44 np0005474864 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Oct  7 15:12:44 np0005474864 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Oct  7 15:12:44 np0005474864 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Oct  7 15:12:44 np0005474864 kernel: TSC deadline timer available
Oct  7 15:12:44 np0005474864 kernel: CPU topo: Max. logical packages:   8
Oct  7 15:12:44 np0005474864 kernel: CPU topo: Max. logical dies:       8
Oct  7 15:12:44 np0005474864 kernel: CPU topo: Max. dies per package:   1
Oct  7 15:12:44 np0005474864 kernel: CPU topo: Max. threads per core:   1
Oct  7 15:12:44 np0005474864 kernel: CPU topo: Num. cores per package:     1
Oct  7 15:12:44 np0005474864 kernel: CPU topo: Num. threads per package:   1
Oct  7 15:12:44 np0005474864 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Oct  7 15:12:44 np0005474864 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Oct  7 15:12:44 np0005474864 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Oct  7 15:12:44 np0005474864 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Oct  7 15:12:44 np0005474864 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Oct  7 15:12:44 np0005474864 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Oct  7 15:12:44 np0005474864 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Oct  7 15:12:44 np0005474864 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Oct  7 15:12:44 np0005474864 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Oct  7 15:12:44 np0005474864 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Oct  7 15:12:44 np0005474864 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Oct  7 15:12:44 np0005474864 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Oct  7 15:12:44 np0005474864 kernel: Booting paravirtualized kernel on KVM
Oct  7 15:12:44 np0005474864 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Oct  7 15:12:44 np0005474864 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Oct  7 15:12:44 np0005474864 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Oct  7 15:12:44 np0005474864 kernel: kvm-guest: PV spinlocks disabled, no host support
Oct  7 15:12:44 np0005474864 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  7 15:12:44 np0005474864 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64", will be passed to user space.
Oct  7 15:12:44 np0005474864 kernel: random: crng init done
Oct  7 15:12:44 np0005474864 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Oct  7 15:12:44 np0005474864 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Oct  7 15:12:44 np0005474864 kernel: Fallback order for Node 0: 0 
Oct  7 15:12:44 np0005474864 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Oct  7 15:12:44 np0005474864 kernel: Policy zone: Normal
Oct  7 15:12:44 np0005474864 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Oct  7 15:12:44 np0005474864 kernel: software IO TLB: area num 8.
Oct  7 15:12:44 np0005474864 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Oct  7 15:12:44 np0005474864 kernel: ftrace: allocating 49370 entries in 193 pages
Oct  7 15:12:44 np0005474864 kernel: ftrace: allocated 193 pages with 3 groups
Oct  7 15:12:44 np0005474864 kernel: Dynamic Preempt: voluntary
Oct  7 15:12:44 np0005474864 kernel: rcu: Preemptible hierarchical RCU implementation.
Oct  7 15:12:44 np0005474864 kernel: rcu: 	RCU event tracing is enabled.
Oct  7 15:12:44 np0005474864 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Oct  7 15:12:44 np0005474864 kernel: 	Trampoline variant of Tasks RCU enabled.
Oct  7 15:12:44 np0005474864 kernel: 	Rude variant of Tasks RCU enabled.
Oct  7 15:12:44 np0005474864 kernel: 	Tracing variant of Tasks RCU enabled.
Oct  7 15:12:44 np0005474864 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Oct  7 15:12:44 np0005474864 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Oct  7 15:12:44 np0005474864 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  7 15:12:44 np0005474864 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  7 15:12:44 np0005474864 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Oct  7 15:12:44 np0005474864 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Oct  7 15:12:44 np0005474864 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Oct  7 15:12:44 np0005474864 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Oct  7 15:12:44 np0005474864 kernel: Console: colour VGA+ 80x25
Oct  7 15:12:44 np0005474864 kernel: printk: console [ttyS0] enabled
Oct  7 15:12:44 np0005474864 kernel: ACPI: Core revision 20230331
Oct  7 15:12:44 np0005474864 kernel: APIC: Switch to symmetric I/O mode setup
Oct  7 15:12:44 np0005474864 kernel: x2apic enabled
Oct  7 15:12:44 np0005474864 kernel: APIC: Switched APIC routing to: physical x2apic
Oct  7 15:12:44 np0005474864 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Oct  7 15:12:44 np0005474864 kernel: Calibrating delay loop (skipped) preset value.. 5600.00 BogoMIPS (lpj=2800000)
Oct  7 15:12:44 np0005474864 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Oct  7 15:12:44 np0005474864 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Oct  7 15:12:44 np0005474864 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Oct  7 15:12:44 np0005474864 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Oct  7 15:12:44 np0005474864 kernel: Spectre V2 : Mitigation: Retpolines
Oct  7 15:12:44 np0005474864 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Oct  7 15:12:44 np0005474864 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Oct  7 15:12:44 np0005474864 kernel: RETBleed: Mitigation: untrained return thunk
Oct  7 15:12:44 np0005474864 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Oct  7 15:12:44 np0005474864 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Oct  7 15:12:44 np0005474864 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Oct  7 15:12:44 np0005474864 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Oct  7 15:12:44 np0005474864 kernel: x86/bugs: return thunk changed
Oct  7 15:12:44 np0005474864 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Oct  7 15:12:44 np0005474864 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Oct  7 15:12:44 np0005474864 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Oct  7 15:12:44 np0005474864 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Oct  7 15:12:44 np0005474864 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Oct  7 15:12:44 np0005474864 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Oct  7 15:12:44 np0005474864 kernel: Freeing SMP alternatives memory: 40K
Oct  7 15:12:44 np0005474864 kernel: pid_max: default: 32768 minimum: 301
Oct  7 15:12:44 np0005474864 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Oct  7 15:12:44 np0005474864 kernel: landlock: Up and running.
Oct  7 15:12:44 np0005474864 kernel: Yama: becoming mindful.
Oct  7 15:12:44 np0005474864 kernel: SELinux:  Initializing.
Oct  7 15:12:44 np0005474864 kernel: LSM support for eBPF active
Oct  7 15:12:44 np0005474864 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct  7 15:12:44 np0005474864 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Oct  7 15:12:44 np0005474864 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Oct  7 15:12:44 np0005474864 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Oct  7 15:12:44 np0005474864 kernel: ... version:                0
Oct  7 15:12:44 np0005474864 kernel: ... bit width:              48
Oct  7 15:12:44 np0005474864 kernel: ... generic registers:      6
Oct  7 15:12:44 np0005474864 kernel: ... value mask:             0000ffffffffffff
Oct  7 15:12:44 np0005474864 kernel: ... max period:             00007fffffffffff
Oct  7 15:12:44 np0005474864 kernel: ... fixed-purpose events:   0
Oct  7 15:12:44 np0005474864 kernel: ... event mask:             000000000000003f
Oct  7 15:12:44 np0005474864 kernel: signal: max sigframe size: 1776
Oct  7 15:12:44 np0005474864 kernel: rcu: Hierarchical SRCU implementation.
Oct  7 15:12:44 np0005474864 kernel: rcu: 	Max phase no-delay instances is 400.
Oct  7 15:12:44 np0005474864 kernel: smp: Bringing up secondary CPUs ...
Oct  7 15:12:44 np0005474864 kernel: smpboot: x86: Booting SMP configuration:
Oct  7 15:12:44 np0005474864 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Oct  7 15:12:44 np0005474864 kernel: smp: Brought up 1 node, 8 CPUs
Oct  7 15:12:44 np0005474864 kernel: smpboot: Total of 8 processors activated (44800.00 BogoMIPS)
Oct  7 15:12:44 np0005474864 kernel: node 0 deferred pages initialised in 18ms
Oct  7 15:12:44 np0005474864 kernel: Memory: 7765392K/8388068K available (16384K kernel code, 5784K rwdata, 13996K rodata, 4068K init, 7304K bss, 616512K reserved, 0K cma-reserved)
Oct  7 15:12:44 np0005474864 kernel: devtmpfs: initialized
Oct  7 15:12:44 np0005474864 kernel: x86/mm: Memory block size: 128MB
Oct  7 15:12:44 np0005474864 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Oct  7 15:12:44 np0005474864 kernel: futex hash table entries: 2048 (order: 5, 131072 bytes, linear)
Oct  7 15:12:44 np0005474864 kernel: pinctrl core: initialized pinctrl subsystem
Oct  7 15:12:44 np0005474864 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Oct  7 15:12:44 np0005474864 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Oct  7 15:12:44 np0005474864 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Oct  7 15:12:44 np0005474864 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Oct  7 15:12:44 np0005474864 kernel: audit: initializing netlink subsys (disabled)
Oct  7 15:12:44 np0005474864 kernel: audit: type=2000 audit(1759864363.441:1): state=initialized audit_enabled=0 res=1
Oct  7 15:12:44 np0005474864 kernel: thermal_sys: Registered thermal governor 'fair_share'
Oct  7 15:12:44 np0005474864 kernel: thermal_sys: Registered thermal governor 'step_wise'
Oct  7 15:12:44 np0005474864 kernel: thermal_sys: Registered thermal governor 'user_space'
Oct  7 15:12:44 np0005474864 kernel: cpuidle: using governor menu
Oct  7 15:12:44 np0005474864 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Oct  7 15:12:44 np0005474864 kernel: PCI: Using configuration type 1 for base access
Oct  7 15:12:44 np0005474864 kernel: PCI: Using configuration type 1 for extended access
Oct  7 15:12:44 np0005474864 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Oct  7 15:12:44 np0005474864 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Oct  7 15:12:44 np0005474864 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Oct  7 15:12:44 np0005474864 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Oct  7 15:12:44 np0005474864 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Oct  7 15:12:44 np0005474864 kernel: Demotion targets for Node 0: null
Oct  7 15:12:44 np0005474864 kernel: cryptd: max_cpu_qlen set to 1000
Oct  7 15:12:44 np0005474864 kernel: ACPI: Added _OSI(Module Device)
Oct  7 15:12:44 np0005474864 kernel: ACPI: Added _OSI(Processor Device)
Oct  7 15:12:44 np0005474864 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Oct  7 15:12:44 np0005474864 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Oct  7 15:12:44 np0005474864 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Oct  7 15:12:44 np0005474864 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Oct  7 15:12:44 np0005474864 kernel: ACPI: Interpreter enabled
Oct  7 15:12:44 np0005474864 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Oct  7 15:12:44 np0005474864 kernel: ACPI: Using IOAPIC for interrupt routing
Oct  7 15:12:44 np0005474864 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Oct  7 15:12:44 np0005474864 kernel: PCI: Using E820 reservations for host bridge windows
Oct  7 15:12:44 np0005474864 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Oct  7 15:12:44 np0005474864 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Oct  7 15:12:44 np0005474864 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [3] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [4] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [5] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [6] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [7] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [8] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [9] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [10] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [11] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [12] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [13] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [14] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [15] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [16] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [17] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [18] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [19] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [20] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [21] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [22] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [23] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [24] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [25] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [26] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [27] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [28] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [29] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [30] registered
Oct  7 15:12:44 np0005474864 kernel: acpiphp: Slot [31] registered
Oct  7 15:12:44 np0005474864 kernel: PCI host bridge to bus 0000:00
Oct  7 15:12:44 np0005474864 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Oct  7 15:12:44 np0005474864 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Oct  7 15:12:44 np0005474864 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Oct  7 15:12:44 np0005474864 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Oct  7 15:12:44 np0005474864 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Oct  7 15:12:44 np0005474864 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Oct  7 15:12:44 np0005474864 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Oct  7 15:12:44 np0005474864 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Oct  7 15:12:44 np0005474864 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Oct  7 15:12:44 np0005474864 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Oct  7 15:12:44 np0005474864 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Oct  7 15:12:44 np0005474864 kernel: iommu: Default domain type: Translated
Oct  7 15:12:44 np0005474864 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Oct  7 15:12:44 np0005474864 kernel: SCSI subsystem initialized
Oct  7 15:12:44 np0005474864 kernel: ACPI: bus type USB registered
Oct  7 15:12:44 np0005474864 kernel: usbcore: registered new interface driver usbfs
Oct  7 15:12:44 np0005474864 kernel: usbcore: registered new interface driver hub
Oct  7 15:12:44 np0005474864 kernel: usbcore: registered new device driver usb
Oct  7 15:12:44 np0005474864 kernel: pps_core: LinuxPPS API ver. 1 registered
Oct  7 15:12:44 np0005474864 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Oct  7 15:12:44 np0005474864 kernel: PTP clock support registered
Oct  7 15:12:44 np0005474864 kernel: EDAC MC: Ver: 3.0.0
Oct  7 15:12:44 np0005474864 kernel: NetLabel: Initializing
Oct  7 15:12:44 np0005474864 kernel: NetLabel:  domain hash size = 128
Oct  7 15:12:44 np0005474864 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Oct  7 15:12:44 np0005474864 kernel: NetLabel:  unlabeled traffic allowed by default
Oct  7 15:12:44 np0005474864 kernel: PCI: Using ACPI for IRQ routing
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Oct  7 15:12:44 np0005474864 kernel: vgaarb: loaded
Oct  7 15:12:44 np0005474864 kernel: clocksource: Switched to clocksource kvm-clock
Oct  7 15:12:44 np0005474864 kernel: VFS: Disk quotas dquot_6.6.0
Oct  7 15:12:44 np0005474864 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Oct  7 15:12:44 np0005474864 kernel: pnp: PnP ACPI init
Oct  7 15:12:44 np0005474864 kernel: pnp: PnP ACPI: found 5 devices
Oct  7 15:12:44 np0005474864 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Oct  7 15:12:44 np0005474864 kernel: NET: Registered PF_INET protocol family
Oct  7 15:12:44 np0005474864 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Oct  7 15:12:44 np0005474864 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Oct  7 15:12:44 np0005474864 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Oct  7 15:12:44 np0005474864 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Oct  7 15:12:44 np0005474864 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Oct  7 15:12:44 np0005474864 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Oct  7 15:12:44 np0005474864 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Oct  7 15:12:44 np0005474864 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct  7 15:12:44 np0005474864 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Oct  7 15:12:44 np0005474864 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Oct  7 15:12:44 np0005474864 kernel: NET: Registered PF_XDP protocol family
Oct  7 15:12:44 np0005474864 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Oct  7 15:12:44 np0005474864 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Oct  7 15:12:44 np0005474864 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Oct  7 15:12:44 np0005474864 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Oct  7 15:12:44 np0005474864 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Oct  7 15:12:44 np0005474864 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Oct  7 15:12:44 np0005474864 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x140 took 75481 usecs
Oct  7 15:12:44 np0005474864 kernel: PCI: CLS 0 bytes, default 64
Oct  7 15:12:44 np0005474864 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Oct  7 15:12:44 np0005474864 kernel: software IO TLB: mapped [mem 0x00000000bbfdb000-0x00000000bffdb000] (64MB)
Oct  7 15:12:44 np0005474864 kernel: ACPI: bus type thunderbolt registered
Oct  7 15:12:44 np0005474864 kernel: Trying to unpack rootfs image as initramfs...
Oct  7 15:12:44 np0005474864 kernel: Initialise system trusted keyrings
Oct  7 15:12:44 np0005474864 kernel: Key type blacklist registered
Oct  7 15:12:44 np0005474864 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Oct  7 15:12:44 np0005474864 kernel: zbud: loaded
Oct  7 15:12:44 np0005474864 kernel: integrity: Platform Keyring initialized
Oct  7 15:12:44 np0005474864 kernel: integrity: Machine keyring initialized
Oct  7 15:12:44 np0005474864 kernel: Freeing initrd memory: 86104K
Oct  7 15:12:44 np0005474864 kernel: NET: Registered PF_ALG protocol family
Oct  7 15:12:44 np0005474864 kernel: xor: automatically using best checksumming function   avx       
Oct  7 15:12:44 np0005474864 kernel: Key type asymmetric registered
Oct  7 15:12:44 np0005474864 kernel: Asymmetric key parser 'x509' registered
Oct  7 15:12:44 np0005474864 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Oct  7 15:12:44 np0005474864 kernel: io scheduler mq-deadline registered
Oct  7 15:12:44 np0005474864 kernel: io scheduler kyber registered
Oct  7 15:12:44 np0005474864 kernel: io scheduler bfq registered
Oct  7 15:12:44 np0005474864 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Oct  7 15:12:44 np0005474864 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Oct  7 15:12:44 np0005474864 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Oct  7 15:12:44 np0005474864 kernel: ACPI: button: Power Button [PWRF]
Oct  7 15:12:44 np0005474864 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Oct  7 15:12:44 np0005474864 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Oct  7 15:12:44 np0005474864 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Oct  7 15:12:44 np0005474864 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Oct  7 15:12:44 np0005474864 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Oct  7 15:12:44 np0005474864 kernel: Non-volatile memory driver v1.3
Oct  7 15:12:44 np0005474864 kernel: rdac: device handler registered
Oct  7 15:12:44 np0005474864 kernel: hp_sw: device handler registered
Oct  7 15:12:44 np0005474864 kernel: emc: device handler registered
Oct  7 15:12:44 np0005474864 kernel: alua: device handler registered
Oct  7 15:12:44 np0005474864 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Oct  7 15:12:44 np0005474864 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Oct  7 15:12:44 np0005474864 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Oct  7 15:12:44 np0005474864 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Oct  7 15:12:44 np0005474864 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Oct  7 15:12:44 np0005474864 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Oct  7 15:12:44 np0005474864 kernel: usb usb1: Product: UHCI Host Controller
Oct  7 15:12:44 np0005474864 kernel: usb usb1: Manufacturer: Linux 5.14.0-620.el9.x86_64 uhci_hcd
Oct  7 15:12:44 np0005474864 kernel: usb usb1: SerialNumber: 0000:00:01.2
Oct  7 15:12:44 np0005474864 kernel: hub 1-0:1.0: USB hub found
Oct  7 15:12:44 np0005474864 kernel: hub 1-0:1.0: 2 ports detected
Oct  7 15:12:44 np0005474864 kernel: usbcore: registered new interface driver usbserial_generic
Oct  7 15:12:44 np0005474864 kernel: usbserial: USB Serial support registered for generic
Oct  7 15:12:44 np0005474864 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Oct  7 15:12:44 np0005474864 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Oct  7 15:12:44 np0005474864 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Oct  7 15:12:44 np0005474864 kernel: mousedev: PS/2 mouse device common for all mice
Oct  7 15:12:44 np0005474864 kernel: rtc_cmos 00:04: RTC can wake from S4
Oct  7 15:12:44 np0005474864 kernel: rtc_cmos 00:04: registered as rtc0
Oct  7 15:12:44 np0005474864 kernel: rtc_cmos 00:04: setting system clock to 2025-10-07T19:12:43 UTC (1759864363)
Oct  7 15:12:44 np0005474864 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Oct  7 15:12:44 np0005474864 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Oct  7 15:12:44 np0005474864 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Oct  7 15:12:44 np0005474864 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Oct  7 15:12:44 np0005474864 kernel: hid: raw HID events driver (C) Jiri Kosina
Oct  7 15:12:44 np0005474864 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Oct  7 15:12:44 np0005474864 kernel: usbcore: registered new interface driver usbhid
Oct  7 15:12:44 np0005474864 kernel: usbhid: USB HID core driver
Oct  7 15:12:44 np0005474864 kernel: drop_monitor: Initializing network drop monitor service
Oct  7 15:12:44 np0005474864 kernel: Initializing XFRM netlink socket
Oct  7 15:12:44 np0005474864 kernel: NET: Registered PF_INET6 protocol family
Oct  7 15:12:44 np0005474864 kernel: Segment Routing with IPv6
Oct  7 15:12:44 np0005474864 kernel: NET: Registered PF_PACKET protocol family
Oct  7 15:12:44 np0005474864 kernel: mpls_gso: MPLS GSO support
Oct  7 15:12:44 np0005474864 kernel: IPI shorthand broadcast: enabled
Oct  7 15:12:44 np0005474864 kernel: AVX2 version of gcm_enc/dec engaged.
Oct  7 15:12:44 np0005474864 kernel: AES CTR mode by8 optimization enabled
Oct  7 15:12:44 np0005474864 kernel: sched_clock: Marking stable (1162002880, 149856560)->(1429412870, -117553430)
Oct  7 15:12:44 np0005474864 kernel: registered taskstats version 1
Oct  7 15:12:44 np0005474864 kernel: Loading compiled-in X.509 certificates
Oct  7 15:12:44 np0005474864 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct  7 15:12:44 np0005474864 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Oct  7 15:12:44 np0005474864 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Oct  7 15:12:44 np0005474864 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Oct  7 15:12:44 np0005474864 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Oct  7 15:12:44 np0005474864 kernel: Demotion targets for Node 0: null
Oct  7 15:12:44 np0005474864 kernel: page_owner is disabled
Oct  7 15:12:44 np0005474864 kernel: Key type .fscrypt registered
Oct  7 15:12:44 np0005474864 kernel: Key type fscrypt-provisioning registered
Oct  7 15:12:44 np0005474864 kernel: Key type big_key registered
Oct  7 15:12:44 np0005474864 kernel: Key type encrypted registered
Oct  7 15:12:44 np0005474864 kernel: ima: No TPM chip found, activating TPM-bypass!
Oct  7 15:12:44 np0005474864 kernel: Loading compiled-in module X.509 certificates
Oct  7 15:12:44 np0005474864 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: 4ff821c4997fbb659836adb05f5bc400c914e148'
Oct  7 15:12:44 np0005474864 kernel: ima: Allocated hash algorithm: sha256
Oct  7 15:12:44 np0005474864 kernel: ima: No architecture policies found
Oct  7 15:12:44 np0005474864 kernel: evm: Initialising EVM extended attributes:
Oct  7 15:12:44 np0005474864 kernel: evm: security.selinux
Oct  7 15:12:44 np0005474864 kernel: evm: security.SMACK64 (disabled)
Oct  7 15:12:44 np0005474864 kernel: evm: security.SMACK64EXEC (disabled)
Oct  7 15:12:44 np0005474864 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Oct  7 15:12:44 np0005474864 kernel: evm: security.SMACK64MMAP (disabled)
Oct  7 15:12:44 np0005474864 kernel: evm: security.apparmor (disabled)
Oct  7 15:12:44 np0005474864 kernel: evm: security.ima
Oct  7 15:12:44 np0005474864 kernel: evm: security.capability
Oct  7 15:12:44 np0005474864 kernel: evm: HMAC attrs: 0x1
Oct  7 15:12:44 np0005474864 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Oct  7 15:12:44 np0005474864 kernel: Running certificate verification RSA selftest
Oct  7 15:12:44 np0005474864 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Oct  7 15:12:44 np0005474864 kernel: Running certificate verification ECDSA selftest
Oct  7 15:12:44 np0005474864 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Oct  7 15:12:44 np0005474864 kernel: clk: Disabling unused clocks
Oct  7 15:12:44 np0005474864 kernel: Freeing unused decrypted memory: 2028K
Oct  7 15:12:44 np0005474864 kernel: Freeing unused kernel image (initmem) memory: 4068K
Oct  7 15:12:44 np0005474864 kernel: Write protecting the kernel read-only data: 30720k
Oct  7 15:12:44 np0005474864 kernel: Freeing unused kernel image (rodata/data gap) memory: 340K
Oct  7 15:12:44 np0005474864 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Oct  7 15:12:44 np0005474864 kernel: Run /init as init process
Oct  7 15:12:44 np0005474864 systemd: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct  7 15:12:44 np0005474864 systemd: Detected virtualization kvm.
Oct  7 15:12:44 np0005474864 systemd: Detected architecture x86-64.
Oct  7 15:12:44 np0005474864 systemd: Running in initrd.
Oct  7 15:12:44 np0005474864 systemd: No hostname configured, using default hostname.
Oct  7 15:12:44 np0005474864 systemd: Hostname set to <localhost>.
Oct  7 15:12:44 np0005474864 systemd: Initializing machine ID from VM UUID.
Oct  7 15:12:44 np0005474864 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Oct  7 15:12:44 np0005474864 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Oct  7 15:12:44 np0005474864 kernel: usb 1-1: Product: QEMU USB Tablet
Oct  7 15:12:44 np0005474864 kernel: usb 1-1: Manufacturer: QEMU
Oct  7 15:12:44 np0005474864 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Oct  7 15:12:44 np0005474864 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Oct  7 15:12:44 np0005474864 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Oct  7 15:12:44 np0005474864 systemd: Queued start job for default target Initrd Default Target.
Oct  7 15:12:44 np0005474864 systemd: Started Dispatch Password Requests to Console Directory Watch.
Oct  7 15:12:44 np0005474864 systemd: Reached target Local Encrypted Volumes.
Oct  7 15:12:44 np0005474864 systemd: Reached target Initrd /usr File System.
Oct  7 15:12:44 np0005474864 systemd: Reached target Local File Systems.
Oct  7 15:12:44 np0005474864 systemd: Reached target Path Units.
Oct  7 15:12:44 np0005474864 systemd: Reached target Slice Units.
Oct  7 15:12:44 np0005474864 systemd: Reached target Swaps.
Oct  7 15:12:44 np0005474864 systemd: Reached target Timer Units.
Oct  7 15:12:44 np0005474864 systemd: Listening on D-Bus System Message Bus Socket.
Oct  7 15:12:44 np0005474864 systemd: Listening on Journal Socket (/dev/log).
Oct  7 15:12:44 np0005474864 systemd: Listening on Journal Socket.
Oct  7 15:12:44 np0005474864 systemd: Listening on udev Control Socket.
Oct  7 15:12:44 np0005474864 systemd: Listening on udev Kernel Socket.
Oct  7 15:12:44 np0005474864 systemd: Reached target Socket Units.
Oct  7 15:12:44 np0005474864 systemd: Starting Create List of Static Device Nodes...
Oct  7 15:12:44 np0005474864 systemd: Starting Journal Service...
Oct  7 15:12:44 np0005474864 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct  7 15:12:44 np0005474864 systemd: Starting Apply Kernel Variables...
Oct  7 15:12:44 np0005474864 systemd: Starting Create System Users...
Oct  7 15:12:44 np0005474864 systemd: Starting Setup Virtual Console...
Oct  7 15:12:44 np0005474864 systemd: Finished Create List of Static Device Nodes.
Oct  7 15:12:44 np0005474864 systemd: Finished Apply Kernel Variables.
Oct  7 15:12:44 np0005474864 systemd: Finished Create System Users.
Oct  7 15:12:44 np0005474864 systemd-journald[308]: Journal started
Oct  7 15:12:44 np0005474864 systemd-journald[308]: Runtime Journal (/run/log/journal/2e9c4e5f05064565be4b95bb9b08ebdc) is 8.0M, max 153.5M, 145.5M free.
Oct  7 15:12:44 np0005474864 systemd-sysusers[312]: Creating group 'users' with GID 100.
Oct  7 15:12:44 np0005474864 systemd-sysusers[312]: Creating group 'dbus' with GID 81.
Oct  7 15:12:44 np0005474864 systemd-sysusers[312]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Oct  7 15:12:44 np0005474864 systemd: Started Journal Service.
Oct  7 15:12:44 np0005474864 systemd[1]: Starting Create Static Device Nodes in /dev...
Oct  7 15:12:44 np0005474864 systemd[1]: Starting Create Volatile Files and Directories...
Oct  7 15:12:44 np0005474864 systemd[1]: Finished Create Static Device Nodes in /dev.
Oct  7 15:12:44 np0005474864 systemd[1]: Finished Create Volatile Files and Directories.
Oct  7 15:12:44 np0005474864 systemd[1]: Finished Setup Virtual Console.
Oct  7 15:12:44 np0005474864 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Oct  7 15:12:44 np0005474864 systemd[1]: Starting dracut cmdline hook...
Oct  7 15:12:44 np0005474864 dracut-cmdline[328]: dracut-9 dracut-057-102.git20250818.el9
Oct  7 15:12:44 np0005474864 dracut-cmdline[328]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-620.el9.x86_64 root=UUID=1631a6ad-43b8-436d-ae76-16fa14b94458 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Oct  7 15:12:44 np0005474864 systemd[1]: Finished dracut cmdline hook.
Oct  7 15:12:44 np0005474864 systemd[1]: Starting dracut pre-udev hook...
Oct  7 15:12:44 np0005474864 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Oct  7 15:12:44 np0005474864 kernel: device-mapper: uevent: version 1.0.3
Oct  7 15:12:44 np0005474864 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Oct  7 15:12:44 np0005474864 kernel: RPC: Registered named UNIX socket transport module.
Oct  7 15:12:44 np0005474864 kernel: RPC: Registered udp transport module.
Oct  7 15:12:44 np0005474864 kernel: RPC: Registered tcp transport module.
Oct  7 15:12:44 np0005474864 kernel: RPC: Registered tcp-with-tls transport module.
Oct  7 15:12:44 np0005474864 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Oct  7 15:12:44 np0005474864 rpc.statd[446]: Version 2.5.4 starting
Oct  7 15:12:44 np0005474864 rpc.statd[446]: Initializing NSM state
Oct  7 15:12:44 np0005474864 rpc.idmapd[451]: Setting log level to 0
Oct  7 15:12:44 np0005474864 systemd[1]: Finished dracut pre-udev hook.
Oct  7 15:12:44 np0005474864 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct  7 15:12:44 np0005474864 systemd-udevd[464]: Using default interface naming scheme 'rhel-9.0'.
Oct  7 15:12:44 np0005474864 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct  7 15:12:44 np0005474864 systemd[1]: Starting dracut pre-trigger hook...
Oct  7 15:12:44 np0005474864 systemd[1]: Finished dracut pre-trigger hook.
Oct  7 15:12:44 np0005474864 systemd[1]: Starting Coldplug All udev Devices...
Oct  7 15:12:44 np0005474864 systemd[1]: Created slice Slice /system/modprobe.
Oct  7 15:12:44 np0005474864 systemd[1]: Starting Load Kernel Module configfs...
Oct  7 15:12:44 np0005474864 systemd[1]: Finished Coldplug All udev Devices.
Oct  7 15:12:44 np0005474864 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  7 15:12:44 np0005474864 systemd[1]: Finished Load Kernel Module configfs.
Oct  7 15:12:44 np0005474864 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct  7 15:12:44 np0005474864 systemd[1]: Reached target Network.
Oct  7 15:12:44 np0005474864 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Oct  7 15:12:44 np0005474864 systemd[1]: Starting dracut initqueue hook...
Oct  7 15:12:44 np0005474864 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Oct  7 15:12:44 np0005474864 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Oct  7 15:12:44 np0005474864 kernel: vda: vda1
Oct  7 15:12:45 np0005474864 kernel: scsi host0: ata_piix
Oct  7 15:12:45 np0005474864 kernel: scsi host1: ata_piix
Oct  7 15:12:45 np0005474864 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Oct  7 15:12:45 np0005474864 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Oct  7 15:12:45 np0005474864 systemd[1]: Found device /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct  7 15:12:45 np0005474864 systemd[1]: Reached target Initrd Root Device.
Oct  7 15:12:45 np0005474864 systemd[1]: Mounting Kernel Configuration File System...
Oct  7 15:12:45 np0005474864 kernel: ata1: found unknown device (class 0)
Oct  7 15:12:45 np0005474864 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Oct  7 15:12:45 np0005474864 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Oct  7 15:12:45 np0005474864 systemd[1]: Mounted Kernel Configuration File System.
Oct  7 15:12:45 np0005474864 systemd[1]: Reached target System Initialization.
Oct  7 15:12:45 np0005474864 systemd[1]: Reached target Basic System.
Oct  7 15:12:45 np0005474864 systemd-udevd[477]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 15:12:45 np0005474864 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Oct  7 15:12:45 np0005474864 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Oct  7 15:12:45 np0005474864 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Oct  7 15:12:45 np0005474864 systemd[1]: Finished dracut initqueue hook.
Oct  7 15:12:45 np0005474864 systemd[1]: Reached target Preparation for Remote File Systems.
Oct  7 15:12:45 np0005474864 systemd[1]: Reached target Remote Encrypted Volumes.
Oct  7 15:12:45 np0005474864 systemd[1]: Reached target Remote File Systems.
Oct  7 15:12:45 np0005474864 systemd[1]: Starting dracut pre-mount hook...
Oct  7 15:12:45 np0005474864 systemd[1]: Finished dracut pre-mount hook.
Oct  7 15:12:45 np0005474864 systemd[1]: Starting File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458...
Oct  7 15:12:45 np0005474864 systemd-fsck[559]: /usr/sbin/fsck.xfs: XFS file system.
Oct  7 15:12:45 np0005474864 systemd[1]: Finished File System Check on /dev/disk/by-uuid/1631a6ad-43b8-436d-ae76-16fa14b94458.
Oct  7 15:12:45 np0005474864 systemd[1]: Mounting /sysroot...
Oct  7 15:12:45 np0005474864 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Oct  7 15:12:45 np0005474864 kernel: XFS (vda1): Mounting V5 Filesystem 1631a6ad-43b8-436d-ae76-16fa14b94458
Oct  7 15:12:45 np0005474864 kernel: XFS (vda1): Ending clean mount
Oct  7 15:12:45 np0005474864 systemd[1]: Mounted /sysroot.
Oct  7 15:12:45 np0005474864 systemd[1]: Reached target Initrd Root File System.
Oct  7 15:12:45 np0005474864 systemd[1]: Starting Mountpoints Configured in the Real Root...
Oct  7 15:12:45 np0005474864 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Oct  7 15:12:45 np0005474864 systemd[1]: Finished Mountpoints Configured in the Real Root.
Oct  7 15:12:45 np0005474864 systemd[1]: Reached target Initrd File Systems.
Oct  7 15:12:45 np0005474864 systemd[1]: Reached target Initrd Default Target.
Oct  7 15:12:45 np0005474864 systemd[1]: Starting dracut mount hook...
Oct  7 15:12:45 np0005474864 systemd[1]: Finished dracut mount hook.
Oct  7 15:12:45 np0005474864 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Oct  7 15:12:46 np0005474864 rpc.idmapd[451]: exiting on signal 15
Oct  7 15:12:46 np0005474864 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Oct  7 15:12:46 np0005474864 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped target Network.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped target Remote Encrypted Volumes.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped target Timer Units.
Oct  7 15:12:46 np0005474864 systemd[1]: dbus.socket: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Closed D-Bus System Message Bus Socket.
Oct  7 15:12:46 np0005474864 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped target Initrd Default Target.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped target Basic System.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped target Initrd Root Device.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped target Initrd /usr File System.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped target Path Units.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped target Remote File Systems.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped target Preparation for Remote File Systems.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped target Slice Units.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped target Socket Units.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped target System Initialization.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped target Local File Systems.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped target Swaps.
Oct  7 15:12:46 np0005474864 systemd[1]: dracut-mount.service: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped dracut mount hook.
Oct  7 15:12:46 np0005474864 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped dracut pre-mount hook.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped target Local Encrypted Volumes.
Oct  7 15:12:46 np0005474864 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Oct  7 15:12:46 np0005474864 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped dracut initqueue hook.
Oct  7 15:12:46 np0005474864 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped Apply Kernel Variables.
Oct  7 15:12:46 np0005474864 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped Create Volatile Files and Directories.
Oct  7 15:12:46 np0005474864 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped Coldplug All udev Devices.
Oct  7 15:12:46 np0005474864 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped dracut pre-trigger hook.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Oct  7 15:12:46 np0005474864 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped Setup Virtual Console.
Oct  7 15:12:46 np0005474864 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: systemd-udevd.service: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Oct  7 15:12:46 np0005474864 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Closed udev Control Socket.
Oct  7 15:12:46 np0005474864 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Closed udev Kernel Socket.
Oct  7 15:12:46 np0005474864 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped dracut pre-udev hook.
Oct  7 15:12:46 np0005474864 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped dracut cmdline hook.
Oct  7 15:12:46 np0005474864 systemd[1]: Starting Cleanup udev Database...
Oct  7 15:12:46 np0005474864 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped Create Static Device Nodes in /dev.
Oct  7 15:12:46 np0005474864 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped Create List of Static Device Nodes.
Oct  7 15:12:46 np0005474864 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Stopped Create System Users.
Oct  7 15:12:46 np0005474864 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Oct  7 15:12:46 np0005474864 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Oct  7 15:12:46 np0005474864 systemd[1]: Finished Cleanup udev Database.
Oct  7 15:12:46 np0005474864 systemd[1]: Reached target Switch Root.
Oct  7 15:12:46 np0005474864 systemd[1]: Starting Switch Root...
Oct  7 15:12:46 np0005474864 systemd[1]: Switching root.
Oct  7 15:12:46 np0005474864 systemd-journald[308]: Received SIGTERM from PID 1 (systemd).
Oct  7 15:12:46 np0005474864 systemd-journald[308]: Journal stopped
Oct  7 15:12:47 np0005474864 kernel: audit: type=1404 audit(1759864366.413:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Oct  7 15:12:47 np0005474864 kernel: SELinux:  policy capability network_peer_controls=1
Oct  7 15:12:47 np0005474864 kernel: SELinux:  policy capability open_perms=1
Oct  7 15:12:47 np0005474864 kernel: SELinux:  policy capability extended_socket_class=1
Oct  7 15:12:47 np0005474864 kernel: SELinux:  policy capability always_check_network=0
Oct  7 15:12:47 np0005474864 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  7 15:12:47 np0005474864 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  7 15:12:47 np0005474864 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  7 15:12:47 np0005474864 kernel: audit: type=1403 audit(1759864366.554:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Oct  7 15:12:47 np0005474864 systemd: Successfully loaded SELinux policy in 144.828ms.
Oct  7 15:12:47 np0005474864 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 29.541ms.
Oct  7 15:12:47 np0005474864 systemd: systemd 252-55.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Oct  7 15:12:47 np0005474864 systemd: Detected virtualization kvm.
Oct  7 15:12:47 np0005474864 systemd: Detected architecture x86-64.
Oct  7 15:12:47 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:12:47 np0005474864 systemd: initrd-switch-root.service: Deactivated successfully.
Oct  7 15:12:47 np0005474864 systemd: Stopped Switch Root.
Oct  7 15:12:47 np0005474864 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Oct  7 15:12:47 np0005474864 systemd: Created slice Slice /system/getty.
Oct  7 15:12:47 np0005474864 systemd: Created slice Slice /system/serial-getty.
Oct  7 15:12:47 np0005474864 systemd: Created slice Slice /system/sshd-keygen.
Oct  7 15:12:47 np0005474864 systemd: Created slice User and Session Slice.
Oct  7 15:12:47 np0005474864 systemd: Started Dispatch Password Requests to Console Directory Watch.
Oct  7 15:12:47 np0005474864 systemd: Started Forward Password Requests to Wall Directory Watch.
Oct  7 15:12:47 np0005474864 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Oct  7 15:12:47 np0005474864 systemd: Reached target Local Encrypted Volumes.
Oct  7 15:12:47 np0005474864 systemd: Stopped target Switch Root.
Oct  7 15:12:47 np0005474864 systemd: Stopped target Initrd File Systems.
Oct  7 15:12:47 np0005474864 systemd: Stopped target Initrd Root File System.
Oct  7 15:12:47 np0005474864 systemd: Reached target Local Integrity Protected Volumes.
Oct  7 15:12:47 np0005474864 systemd: Reached target Path Units.
Oct  7 15:12:47 np0005474864 systemd: Reached target rpc_pipefs.target.
Oct  7 15:12:47 np0005474864 systemd: Reached target Slice Units.
Oct  7 15:12:47 np0005474864 systemd: Reached target Swaps.
Oct  7 15:12:47 np0005474864 systemd: Reached target Local Verity Protected Volumes.
Oct  7 15:12:47 np0005474864 systemd: Listening on RPCbind Server Activation Socket.
Oct  7 15:12:47 np0005474864 systemd: Reached target RPC Port Mapper.
Oct  7 15:12:47 np0005474864 systemd: Listening on Process Core Dump Socket.
Oct  7 15:12:47 np0005474864 systemd: Listening on initctl Compatibility Named Pipe.
Oct  7 15:12:47 np0005474864 systemd: Listening on udev Control Socket.
Oct  7 15:12:47 np0005474864 systemd: Listening on udev Kernel Socket.
Oct  7 15:12:47 np0005474864 systemd: Mounting Huge Pages File System...
Oct  7 15:12:47 np0005474864 systemd: Mounting POSIX Message Queue File System...
Oct  7 15:12:47 np0005474864 systemd: Mounting Kernel Debug File System...
Oct  7 15:12:47 np0005474864 systemd: Mounting Kernel Trace File System...
Oct  7 15:12:47 np0005474864 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct  7 15:12:47 np0005474864 systemd: Starting Create List of Static Device Nodes...
Oct  7 15:12:47 np0005474864 systemd: Starting Load Kernel Module configfs...
Oct  7 15:12:47 np0005474864 systemd: Starting Load Kernel Module drm...
Oct  7 15:12:47 np0005474864 systemd: Starting Load Kernel Module efi_pstore...
Oct  7 15:12:47 np0005474864 systemd: Starting Load Kernel Module fuse...
Oct  7 15:12:47 np0005474864 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Oct  7 15:12:47 np0005474864 systemd: systemd-fsck-root.service: Deactivated successfully.
Oct  7 15:12:47 np0005474864 systemd: Stopped File System Check on Root Device.
Oct  7 15:12:47 np0005474864 systemd: Stopped Journal Service.
Oct  7 15:12:47 np0005474864 systemd: Starting Journal Service...
Oct  7 15:12:47 np0005474864 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Oct  7 15:12:47 np0005474864 systemd: Starting Generate network units from Kernel command line...
Oct  7 15:12:47 np0005474864 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  7 15:12:47 np0005474864 systemd: Starting Remount Root and Kernel File Systems...
Oct  7 15:12:47 np0005474864 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Oct  7 15:12:47 np0005474864 kernel: fuse: init (API version 7.37)
Oct  7 15:12:47 np0005474864 systemd: Starting Apply Kernel Variables...
Oct  7 15:12:47 np0005474864 systemd: Starting Coldplug All udev Devices...
Oct  7 15:12:47 np0005474864 systemd: Mounted Huge Pages File System.
Oct  7 15:12:47 np0005474864 systemd: Mounted POSIX Message Queue File System.
Oct  7 15:12:47 np0005474864 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Oct  7 15:12:47 np0005474864 systemd: Mounted Kernel Debug File System.
Oct  7 15:12:47 np0005474864 systemd: Mounted Kernel Trace File System.
Oct  7 15:12:47 np0005474864 systemd-journald[682]: Journal started
Oct  7 15:12:47 np0005474864 systemd-journald[682]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct  7 15:12:47 np0005474864 systemd: Finished Create List of Static Device Nodes.
Oct  7 15:12:47 np0005474864 systemd[1]: Queued start job for default target Multi-User System.
Oct  7 15:12:47 np0005474864 systemd[1]: systemd-journald.service: Deactivated successfully.
Oct  7 15:12:47 np0005474864 systemd: Started Journal Service.
Oct  7 15:12:47 np0005474864 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Load Kernel Module configfs.
Oct  7 15:12:47 np0005474864 kernel: ACPI: bus type drm_connector registered
Oct  7 15:12:47 np0005474864 systemd[1]: modprobe@drm.service: Deactivated successfully.
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Load Kernel Module drm.
Oct  7 15:12:47 np0005474864 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Load Kernel Module efi_pstore.
Oct  7 15:12:47 np0005474864 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Load Kernel Module fuse.
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Generate network units from Kernel command line.
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Remount Root and Kernel File Systems.
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Apply Kernel Variables.
Oct  7 15:12:47 np0005474864 systemd[1]: Mounting FUSE Control File System...
Oct  7 15:12:47 np0005474864 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct  7 15:12:47 np0005474864 systemd[1]: Starting Rebuild Hardware Database...
Oct  7 15:12:47 np0005474864 systemd[1]: Starting Flush Journal to Persistent Storage...
Oct  7 15:12:47 np0005474864 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Oct  7 15:12:47 np0005474864 systemd[1]: Starting Load/Save OS Random Seed...
Oct  7 15:12:47 np0005474864 systemd[1]: Starting Create System Users...
Oct  7 15:12:47 np0005474864 systemd[1]: Mounted FUSE Control File System.
Oct  7 15:12:47 np0005474864 systemd-journald[682]: Runtime Journal (/run/log/journal/42833e1b511a402df82cb9cb2fc36491) is 8.0M, max 153.5M, 145.5M free.
Oct  7 15:12:47 np0005474864 systemd-journald[682]: Received client request to flush runtime journal.
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Flush Journal to Persistent Storage.
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Coldplug All udev Devices.
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Load/Save OS Random Seed.
Oct  7 15:12:47 np0005474864 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Create System Users.
Oct  7 15:12:47 np0005474864 systemd[1]: Starting Create Static Device Nodes in /dev...
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Create Static Device Nodes in /dev.
Oct  7 15:12:47 np0005474864 systemd[1]: Reached target Preparation for Local File Systems.
Oct  7 15:12:47 np0005474864 systemd[1]: Reached target Local File Systems.
Oct  7 15:12:47 np0005474864 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Oct  7 15:12:47 np0005474864 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Oct  7 15:12:47 np0005474864 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Oct  7 15:12:47 np0005474864 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Oct  7 15:12:47 np0005474864 systemd[1]: Starting Automatic Boot Loader Update...
Oct  7 15:12:47 np0005474864 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Oct  7 15:12:47 np0005474864 systemd[1]: Starting Create Volatile Files and Directories...
Oct  7 15:12:47 np0005474864 bootctl[701]: Couldn't find EFI system partition, skipping.
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Automatic Boot Loader Update.
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Create Volatile Files and Directories.
Oct  7 15:12:47 np0005474864 systemd[1]: Starting Security Auditing Service...
Oct  7 15:12:47 np0005474864 systemd[1]: Starting RPC Bind...
Oct  7 15:12:47 np0005474864 systemd[1]: Starting Rebuild Journal Catalog...
Oct  7 15:12:47 np0005474864 auditd[707]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Oct  7 15:12:47 np0005474864 auditd[707]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Rebuild Journal Catalog.
Oct  7 15:12:47 np0005474864 systemd[1]: Started RPC Bind.
Oct  7 15:12:47 np0005474864 augenrules[712]: /sbin/augenrules: No change
Oct  7 15:12:47 np0005474864 augenrules[727]: No rules
Oct  7 15:12:47 np0005474864 augenrules[727]: enabled 1
Oct  7 15:12:47 np0005474864 augenrules[727]: failure 1
Oct  7 15:12:47 np0005474864 augenrules[727]: pid 707
Oct  7 15:12:47 np0005474864 augenrules[727]: rate_limit 0
Oct  7 15:12:47 np0005474864 augenrules[727]: backlog_limit 8192
Oct  7 15:12:47 np0005474864 augenrules[727]: lost 0
Oct  7 15:12:47 np0005474864 augenrules[727]: backlog 0
Oct  7 15:12:47 np0005474864 augenrules[727]: backlog_wait_time 60000
Oct  7 15:12:47 np0005474864 augenrules[727]: backlog_wait_time_actual 0
Oct  7 15:12:47 np0005474864 systemd[1]: Started Security Auditing Service.
Oct  7 15:12:47 np0005474864 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Rebuild Hardware Database.
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Oct  7 15:12:47 np0005474864 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Oct  7 15:12:47 np0005474864 systemd[1]: Starting Update is Completed...
Oct  7 15:12:47 np0005474864 systemd[1]: Finished Update is Completed.
Oct  7 15:12:48 np0005474864 systemd-udevd[735]: Using default interface naming scheme 'rhel-9.0'.
Oct  7 15:12:48 np0005474864 systemd[1]: Started Rule-based Manager for Device Events and Files.
Oct  7 15:12:48 np0005474864 systemd[1]: Reached target System Initialization.
Oct  7 15:12:48 np0005474864 systemd[1]: Started dnf makecache --timer.
Oct  7 15:12:48 np0005474864 systemd[1]: Started Daily rotation of log files.
Oct  7 15:12:48 np0005474864 systemd[1]: Started Daily Cleanup of Temporary Directories.
Oct  7 15:12:48 np0005474864 systemd[1]: Reached target Timer Units.
Oct  7 15:12:48 np0005474864 systemd[1]: Listening on D-Bus System Message Bus Socket.
Oct  7 15:12:48 np0005474864 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Oct  7 15:12:48 np0005474864 systemd[1]: Reached target Socket Units.
Oct  7 15:12:48 np0005474864 systemd[1]: Starting D-Bus System Message Bus...
Oct  7 15:12:48 np0005474864 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  7 15:12:48 np0005474864 systemd[1]: Starting Load Kernel Module configfs...
Oct  7 15:12:48 np0005474864 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Oct  7 15:12:48 np0005474864 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Oct  7 15:12:48 np0005474864 systemd[1]: Finished Load Kernel Module configfs.
Oct  7 15:12:48 np0005474864 systemd-udevd[746]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 15:12:48 np0005474864 systemd[1]: Started D-Bus System Message Bus.
Oct  7 15:12:48 np0005474864 systemd[1]: Reached target Basic System.
Oct  7 15:12:48 np0005474864 dbus-broker-lau[766]: Ready
Oct  7 15:12:48 np0005474864 systemd[1]: Starting NTP client/server...
Oct  7 15:12:48 np0005474864 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Oct  7 15:12:48 np0005474864 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Oct  7 15:12:48 np0005474864 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Oct  7 15:12:48 np0005474864 systemd[1]: Starting Restore /run/initramfs on shutdown...
Oct  7 15:12:48 np0005474864 systemd[1]: Starting IPv4 firewall with iptables...
Oct  7 15:12:48 np0005474864 chronyd[792]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct  7 15:12:48 np0005474864 chronyd[792]: Loaded 0 symmetric keys
Oct  7 15:12:48 np0005474864 chronyd[792]: Using right/UTC timezone to obtain leap second data
Oct  7 15:12:48 np0005474864 chronyd[792]: Loaded seccomp filter (level 2)
Oct  7 15:12:48 np0005474864 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Oct  7 15:12:48 np0005474864 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Oct  7 15:12:48 np0005474864 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Oct  7 15:12:48 np0005474864 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Oct  7 15:12:48 np0005474864 systemd[1]: Started irqbalance daemon.
Oct  7 15:12:48 np0005474864 kernel: Console: switching to colour dummy device 80x25
Oct  7 15:12:48 np0005474864 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Oct  7 15:12:48 np0005474864 kernel: [drm] features: -context_init
Oct  7 15:12:48 np0005474864 kernel: [drm] number of scanouts: 1
Oct  7 15:12:48 np0005474864 kernel: [drm] number of cap sets: 0
Oct  7 15:12:48 np0005474864 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Oct  7 15:12:48 np0005474864 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  7 15:12:48 np0005474864 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  7 15:12:48 np0005474864 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  7 15:12:48 np0005474864 systemd[1]: Reached target sshd-keygen.target.
Oct  7 15:12:48 np0005474864 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Oct  7 15:12:48 np0005474864 systemd[1]: Reached target User and Group Name Lookups.
Oct  7 15:12:48 np0005474864 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Oct  7 15:12:48 np0005474864 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Oct  7 15:12:48 np0005474864 kernel: Console: switching to colour frame buffer device 128x48
Oct  7 15:12:48 np0005474864 systemd[1]: Starting User Login Management...
Oct  7 15:12:48 np0005474864 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Oct  7 15:12:48 np0005474864 systemd[1]: Started NTP client/server.
Oct  7 15:12:48 np0005474864 systemd[1]: Finished Restore /run/initramfs on shutdown.
Oct  7 15:12:48 np0005474864 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Oct  7 15:12:48 np0005474864 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Oct  7 15:12:48 np0005474864 systemd-logind[805]: New seat seat0.
Oct  7 15:12:48 np0005474864 systemd-logind[805]: Watching system buttons on /dev/input/event0 (Power Button)
Oct  7 15:12:48 np0005474864 systemd-logind[805]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct  7 15:12:48 np0005474864 systemd[1]: Started User Login Management.
Oct  7 15:12:48 np0005474864 kernel: kvm_amd: TSC scaling supported
Oct  7 15:12:48 np0005474864 kernel: kvm_amd: Nested Virtualization enabled
Oct  7 15:12:48 np0005474864 kernel: kvm_amd: Nested Paging enabled
Oct  7 15:12:48 np0005474864 kernel: kvm_amd: LBR virtualization supported
Oct  7 15:12:48 np0005474864 iptables.init[784]: iptables: Applying firewall rules: [  OK  ]
Oct  7 15:12:48 np0005474864 systemd[1]: Finished IPv4 firewall with iptables.
Oct  7 15:12:48 np0005474864 cloud-init[844]: Cloud-init v. 24.4-7.el9 running 'init-local' at Tue, 07 Oct 2025 19:12:48 +0000. Up 6.48 seconds.
Oct  7 15:12:49 np0005474864 systemd[1]: run-cloud\x2dinit-tmp-tmpf_7k76ex.mount: Deactivated successfully.
Oct  7 15:12:49 np0005474864 systemd[1]: Starting Hostname Service...
Oct  7 15:12:49 np0005474864 systemd[1]: Started Hostname Service.
Oct  7 15:12:49 np0005474864 systemd-hostnamed[858]: Hostname set to <np0005474864.novalocal> (static)
Oct  7 15:12:49 np0005474864 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Oct  7 15:12:49 np0005474864 systemd[1]: Reached target Preparation for Network.
Oct  7 15:12:49 np0005474864 systemd[1]: Starting Network Manager...
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5376] NetworkManager (version 1.54.1-1.el9) is starting... (boot:6a778c83-97cd-4db9-828a-f91a822d201d)
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5380] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5505] manager[0x5617908bb080]: monitoring kernel firmware directory '/lib/firmware'.
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5549] hostname: hostname: using hostnamed
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5549] hostname: static hostname changed from (none) to "np0005474864.novalocal"
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5552] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5655] manager[0x5617908bb080]: rfkill: Wi-Fi hardware radio set enabled
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5656] manager[0x5617908bb080]: rfkill: WWAN hardware radio set enabled
Oct  7 15:12:49 np0005474864 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5732] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5733] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5734] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5735] manager: Networking is enabled by state file
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5737] settings: Loaded settings plugin: keyfile (internal)
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5769] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5796] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5822] dhcp: init: Using DHCP client 'internal'
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5825] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5840] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5853] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5861] device (lo): Activation: starting connection 'lo' (7186dd46-7bf7-4d5a-893f-437c9f730689)
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5870] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5872] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5897] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5901] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5903] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5905] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5907] device (eth0): carrier: link connected
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5910] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  7 15:12:49 np0005474864 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5915] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5928] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5932] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  7 15:12:49 np0005474864 systemd[1]: Started Network Manager.
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5933] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5934] manager: NetworkManager state is now CONNECTING
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5935] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  7 15:12:49 np0005474864 systemd[1]: Reached target Network.
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5951] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.5954] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  7 15:12:49 np0005474864 systemd[1]: Starting Network Manager Wait Online...
Oct  7 15:12:49 np0005474864 systemd[1]: Starting GSSAPI Proxy Daemon...
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.6012] dhcp4 (eth0): state changed new lease, address=38.102.83.243
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.6019] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.6036] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  7 15:12:49 np0005474864 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.6114] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.6115] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.6117] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.6123] device (lo): Activation: successful, device activated.
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.6127] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.6132] manager: NetworkManager state is now CONNECTED_SITE
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.6134] device (eth0): Activation: successful, device activated.
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.6137] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  7 15:12:49 np0005474864 NetworkManager[862]: <info>  [1759864369.6142] manager: startup complete
Oct  7 15:12:49 np0005474864 systemd[1]: Started GSSAPI Proxy Daemon.
Oct  7 15:12:49 np0005474864 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Oct  7 15:12:49 np0005474864 systemd[1]: Reached target NFS client services.
Oct  7 15:12:49 np0005474864 systemd[1]: Reached target Preparation for Remote File Systems.
Oct  7 15:12:49 np0005474864 systemd[1]: Reached target Remote File Systems.
Oct  7 15:12:49 np0005474864 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Oct  7 15:12:49 np0005474864 systemd[1]: Finished Network Manager Wait Online.
Oct  7 15:12:49 np0005474864 systemd[1]: Starting Cloud-init: Network Stage...
Oct  7 15:12:49 np0005474864 cloud-init[927]: Cloud-init v. 24.4-7.el9 running 'init' at Tue, 07 Oct 2025 19:12:49 +0000. Up 7.51 seconds.
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: |  eth0  | True |        38.102.83.243         | 255.255.255.0 | global | fa:16:3e:5c:ce:8e |
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: |  eth0  | True | fe80::f816:3eff:fe5c:ce8e/64 |       .       |  link  | fa:16:3e:5c:ce:8e |
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Oct  7 15:12:49 np0005474864 cloud-init[927]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Oct  7 15:12:50 np0005474864 cloud-init[927]: ci-info: +-------+-------------+---------+-----------+-------+
Oct  7 15:12:51 np0005474864 cloud-init[927]: Generating public/private rsa key pair.
Oct  7 15:12:51 np0005474864 cloud-init[927]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Oct  7 15:12:51 np0005474864 cloud-init[927]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Oct  7 15:12:51 np0005474864 cloud-init[927]: The key fingerprint is:
Oct  7 15:12:51 np0005474864 cloud-init[927]: SHA256:t/4QbRjUUuf0PLFNzaRK57cRYEVIdyqEvih/ZfRrY/0 root@np0005474864.novalocal
Oct  7 15:12:51 np0005474864 cloud-init[927]: The key's randomart image is:
Oct  7 15:12:51 np0005474864 cloud-init[927]: +---[RSA 3072]----+
Oct  7 15:12:51 np0005474864 cloud-init[927]: |          .+++*B=|
Oct  7 15:12:51 np0005474864 cloud-init[927]: |         .o.o=o*B|
Oct  7 15:12:51 np0005474864 cloud-init[927]: |         ...o =++|
Oct  7 15:12:51 np0005474864 cloud-init[927]: |          .=.=  o|
Oct  7 15:12:51 np0005474864 cloud-init[927]: |        S.+o+....|
Oct  7 15:12:51 np0005474864 cloud-init[927]: |      . ...+o ..o|
Oct  7 15:12:51 np0005474864 cloud-init[927]: |       o  oo   + |
Oct  7 15:12:51 np0005474864 cloud-init[927]: |        ....  = .|
Oct  7 15:12:51 np0005474864 cloud-init[927]: |         ....o .E|
Oct  7 15:12:51 np0005474864 cloud-init[927]: +----[SHA256]-----+
Oct  7 15:12:51 np0005474864 cloud-init[927]: Generating public/private ecdsa key pair.
Oct  7 15:12:51 np0005474864 cloud-init[927]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Oct  7 15:12:51 np0005474864 cloud-init[927]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Oct  7 15:12:51 np0005474864 cloud-init[927]: The key fingerprint is:
Oct  7 15:12:51 np0005474864 cloud-init[927]: SHA256:Qtuw1rwqiTpLlx1C1MqanuMV+8IJjea35wR82jKAbkg root@np0005474864.novalocal
Oct  7 15:12:51 np0005474864 cloud-init[927]: The key's randomart image is:
Oct  7 15:12:51 np0005474864 cloud-init[927]: +---[ECDSA 256]---+
Oct  7 15:12:51 np0005474864 cloud-init[927]: |   ..            |
Oct  7 15:12:51 np0005474864 cloud-init[927]: |  .  .           |
Oct  7 15:12:51 np0005474864 cloud-init[927]: |  ... o          |
Oct  7 15:12:51 np0005474864 cloud-init[927]: |. oo . B         |
Oct  7 15:12:51 np0005474864 cloud-init[927]: |.E== o= S        |
Oct  7 15:12:51 np0005474864 cloud-init[927]: |+*..@... .       |
Oct  7 15:12:51 np0005474864 cloud-init[927]: |=++X.=  .        |
Oct  7 15:12:51 np0005474864 cloud-init[927]: |+=+=B. .         |
Oct  7 15:12:51 np0005474864 cloud-init[927]: |+=o.=+.          |
Oct  7 15:12:51 np0005474864 cloud-init[927]: +----[SHA256]-----+
Oct  7 15:12:51 np0005474864 cloud-init[927]: Generating public/private ed25519 key pair.
Oct  7 15:12:51 np0005474864 cloud-init[927]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Oct  7 15:12:51 np0005474864 cloud-init[927]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Oct  7 15:12:51 np0005474864 cloud-init[927]: The key fingerprint is:
Oct  7 15:12:51 np0005474864 cloud-init[927]: SHA256:OAtRHXsHbA/Itxyy4hB/kzIMVoS7RDIa7fENW318QJA root@np0005474864.novalocal
Oct  7 15:12:51 np0005474864 cloud-init[927]: The key's randomart image is:
Oct  7 15:12:51 np0005474864 cloud-init[927]: +--[ED25519 256]--+
Oct  7 15:12:51 np0005474864 cloud-init[927]: | .   +++oO+.     |
Oct  7 15:12:51 np0005474864 cloud-init[927]: |. = B.. E.O..    |
Oct  7 15:12:51 np0005474864 cloud-init[927]: | + B.@  .O.*.    |
Oct  7 15:12:51 np0005474864 cloud-init[927]: |. . *.B.=.o..    |
Oct  7 15:12:51 np0005474864 cloud-init[927]: |   ..+o=S.       |
Oct  7 15:12:51 np0005474864 cloud-init[927]: |    ...o         |
Oct  7 15:12:51 np0005474864 cloud-init[927]: |      .          |
Oct  7 15:12:51 np0005474864 cloud-init[927]: |                 |
Oct  7 15:12:51 np0005474864 cloud-init[927]: |                 |
Oct  7 15:12:51 np0005474864 cloud-init[927]: +----[SHA256]-----+
Oct  7 15:12:51 np0005474864 systemd[1]: Finished Cloud-init: Network Stage.
Oct  7 15:12:51 np0005474864 systemd[1]: Reached target Cloud-config availability.
Oct  7 15:12:51 np0005474864 systemd[1]: Reached target Network is Online.
Oct  7 15:12:51 np0005474864 systemd[1]: Starting Cloud-init: Config Stage...
Oct  7 15:12:51 np0005474864 systemd[1]: Starting Notify NFS peers of a restart...
Oct  7 15:12:51 np0005474864 systemd[1]: Starting System Logging Service...
Oct  7 15:12:51 np0005474864 sm-notify[1009]: Version 2.5.4 starting
Oct  7 15:12:51 np0005474864 systemd[1]: Starting OpenSSH server daemon...
Oct  7 15:12:51 np0005474864 systemd[1]: Starting Permit User Sessions...
Oct  7 15:12:51 np0005474864 systemd[1]: Started Notify NFS peers of a restart.
Oct  7 15:12:51 np0005474864 systemd[1]: Finished Permit User Sessions.
Oct  7 15:12:51 np0005474864 systemd[1]: Started OpenSSH server daemon.
Oct  7 15:12:51 np0005474864 systemd[1]: Started Command Scheduler.
Oct  7 15:12:51 np0005474864 systemd[1]: Started Getty on tty1.
Oct  7 15:12:51 np0005474864 systemd[1]: Started Serial Getty on ttyS0.
Oct  7 15:12:51 np0005474864 systemd[1]: Reached target Login Prompts.
Oct  7 15:12:51 np0005474864 rsyslogd[1010]: [origin software="rsyslogd" swVersion="8.2506.0-2.el9" x-pid="1010" x-info="https://www.rsyslog.com"] start
Oct  7 15:12:51 np0005474864 systemd[1]: Started System Logging Service.
Oct  7 15:12:51 np0005474864 rsyslogd[1010]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2506.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Oct  7 15:12:51 np0005474864 systemd[1]: Reached target Multi-User System.
Oct  7 15:12:51 np0005474864 systemd[1]: Starting Record Runlevel Change in UTMP...
Oct  7 15:12:52 np0005474864 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Oct  7 15:12:52 np0005474864 systemd[1]: Finished Record Runlevel Change in UTMP.
Oct  7 15:12:52 np0005474864 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  7 15:12:52 np0005474864 cloud-init[1041]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Tue, 07 Oct 2025 19:12:52 +0000. Up 9.75 seconds.
Oct  7 15:12:52 np0005474864 systemd[1]: Finished Cloud-init: Config Stage.
Oct  7 15:12:52 np0005474864 systemd[1]: Starting Cloud-init: Final Stage...
Oct  7 15:12:52 np0005474864 cloud-init[1045]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Tue, 07 Oct 2025 19:12:52 +0000. Up 10.13 seconds.
Oct  7 15:12:52 np0005474864 cloud-init[1047]: #############################################################
Oct  7 15:12:52 np0005474864 cloud-init[1048]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Oct  7 15:12:52 np0005474864 cloud-init[1050]: 256 SHA256:Qtuw1rwqiTpLlx1C1MqanuMV+8IJjea35wR82jKAbkg root@np0005474864.novalocal (ECDSA)
Oct  7 15:12:52 np0005474864 cloud-init[1052]: 256 SHA256:OAtRHXsHbA/Itxyy4hB/kzIMVoS7RDIa7fENW318QJA root@np0005474864.novalocal (ED25519)
Oct  7 15:12:52 np0005474864 cloud-init[1054]: 3072 SHA256:t/4QbRjUUuf0PLFNzaRK57cRYEVIdyqEvih/ZfRrY/0 root@np0005474864.novalocal (RSA)
Oct  7 15:12:52 np0005474864 cloud-init[1055]: -----END SSH HOST KEY FINGERPRINTS-----
Oct  7 15:12:52 np0005474864 cloud-init[1056]: #############################################################
Oct  7 15:12:52 np0005474864 cloud-init[1045]: Cloud-init v. 24.4-7.el9 finished at Tue, 07 Oct 2025 19:12:52 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 10.33 seconds
Oct  7 15:12:52 np0005474864 systemd[1]: Finished Cloud-init: Final Stage.
Oct  7 15:12:52 np0005474864 systemd[1]: Reached target Cloud-init target.
Oct  7 15:12:52 np0005474864 systemd[1]: Startup finished in 1.518s (kernel) + 2.479s (initrd) + 6.392s (userspace) = 10.390s.
Oct  7 15:12:56 np0005474864 chronyd[792]: Selected source 54.39.23.64 (2.centos.pool.ntp.org)
Oct  7 15:12:56 np0005474864 chronyd[792]: System clock wrong by 1.645432 seconds
Oct  7 15:12:56 np0005474864 chronyd[792]: System clock was stepped by 1.645432 seconds
Oct  7 15:12:56 np0005474864 chronyd[792]: System clock TAI offset set to 37 seconds
Oct  7 15:13:00 np0005474864 irqbalance[796]: Cannot change IRQ 25 affinity: Operation not permitted
Oct  7 15:13:00 np0005474864 irqbalance[796]: IRQ 25 affinity is now unmanaged
Oct  7 15:13:00 np0005474864 irqbalance[796]: Cannot change IRQ 31 affinity: Operation not permitted
Oct  7 15:13:00 np0005474864 irqbalance[796]: IRQ 31 affinity is now unmanaged
Oct  7 15:13:00 np0005474864 irqbalance[796]: Cannot change IRQ 28 affinity: Operation not permitted
Oct  7 15:13:00 np0005474864 irqbalance[796]: IRQ 28 affinity is now unmanaged
Oct  7 15:13:00 np0005474864 irqbalance[796]: Cannot change IRQ 32 affinity: Operation not permitted
Oct  7 15:13:00 np0005474864 irqbalance[796]: IRQ 32 affinity is now unmanaged
Oct  7 15:13:00 np0005474864 irqbalance[796]: Cannot change IRQ 30 affinity: Operation not permitted
Oct  7 15:13:00 np0005474864 irqbalance[796]: IRQ 30 affinity is now unmanaged
Oct  7 15:13:00 np0005474864 irqbalance[796]: Cannot change IRQ 29 affinity: Operation not permitted
Oct  7 15:13:00 np0005474864 irqbalance[796]: IRQ 29 affinity is now unmanaged
Oct  7 15:13:01 np0005474864 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  7 15:13:10 np0005474864 systemd[1]: Created slice User Slice of UID 1000.
Oct  7 15:13:10 np0005474864 systemd[1]: Starting User Runtime Directory /run/user/1000...
Oct  7 15:13:10 np0005474864 systemd-logind[805]: New session 1 of user zuul.
Oct  7 15:13:10 np0005474864 systemd[1]: Finished User Runtime Directory /run/user/1000.
Oct  7 15:13:10 np0005474864 systemd[1]: Starting User Manager for UID 1000...
Oct  7 15:13:10 np0005474864 systemd[1064]: Queued start job for default target Main User Target.
Oct  7 15:13:10 np0005474864 systemd[1064]: Created slice User Application Slice.
Oct  7 15:13:10 np0005474864 systemd[1064]: Started Mark boot as successful after the user session has run 2 minutes.
Oct  7 15:13:10 np0005474864 systemd[1064]: Started Daily Cleanup of User's Temporary Directories.
Oct  7 15:13:10 np0005474864 systemd[1064]: Reached target Paths.
Oct  7 15:13:10 np0005474864 systemd[1064]: Reached target Timers.
Oct  7 15:13:10 np0005474864 systemd[1064]: Starting D-Bus User Message Bus Socket...
Oct  7 15:13:10 np0005474864 systemd[1064]: Starting Create User's Volatile Files and Directories...
Oct  7 15:13:10 np0005474864 systemd[1064]: Listening on D-Bus User Message Bus Socket.
Oct  7 15:13:10 np0005474864 systemd[1064]: Reached target Sockets.
Oct  7 15:13:10 np0005474864 systemd[1064]: Finished Create User's Volatile Files and Directories.
Oct  7 15:13:10 np0005474864 systemd[1064]: Reached target Basic System.
Oct  7 15:13:10 np0005474864 systemd[1064]: Reached target Main User Target.
Oct  7 15:13:10 np0005474864 systemd[1064]: Startup finished in 155ms.
Oct  7 15:13:10 np0005474864 systemd[1]: Started User Manager for UID 1000.
Oct  7 15:13:10 np0005474864 systemd[1]: Started Session 1 of User zuul.
Oct  7 15:13:11 np0005474864 python3[1146]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:13:15 np0005474864 python3[1174]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:13:21 np0005474864 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  7 15:13:21 np0005474864 python3[1234]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:13:22 np0005474864 python3[1274]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Oct  7 15:13:25 np0005474864 python3[1300]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC5auL++hCqXQRub8SQVxhJfZhJhrYtbWUgbjI36pfXg9cn/OzbdWkJsRt5ROPgljgEig/7Eh4H3o9cWdfeMA1Z8ufm8RqSYDok1ro3nfBKXXEImwiuQOJC4nfpKi/ZThxbj/Q6ABIxRH46J4WlN+/hUDQuiVQ5UmYHoRejpRQF+t8D/1ndvt6FpkWQUllgaC/VqXM+PO4AKlDyV1nxXn040HLnlWvXLCMHzUONfvPRmKfzl2U1Nwrww59KGH890Lzv2kR6JCWscholOCaEYxOHRzI2UoGI7XKPoB8KyJHmQ/6BUDDFKVRN8ryHZ+6f07hDo2aBXB7azPtqOVWMTFKGK9jjKinoItHOwdAIwr4jTu76w0yfGgzdwnuQK6mTzuK/VfcHDfCOVrZ8ZIM8LHWIwYnPhQb1qH1L5ukd/6GbSzQtLaTOH7E/FR4YTPsZJLJYF9ucUzjT2fZ3CeM6xrsvVcYyNGisZ+CzMMPNnabcMgGAId2UzIKDA+xDH60Kboc= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:25 np0005474864 python3[1324]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:13:26 np0005474864 python3[1423]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  7 15:13:26 np0005474864 python3[1494]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759864405.902083-254-269276537705469/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=6740eee73c8b40449e9a44f1cca2648c_id_rsa follow=False checksum=4930a67bb48cf35d2bcc47fbadc5dc6fae9aed8c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:13:27 np0005474864 python3[1617]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  7 15:13:27 np0005474864 python3[1688]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759864406.910784-309-4236721585249/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=6740eee73c8b40449e9a44f1cca2648c_id_rsa.pub follow=False checksum=324ac243a5597caf69cbbbc970c99595d568983e backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:13:28 np0005474864 python3[1736]: ansible-ping Invoked with data=pong
Oct  7 15:13:29 np0005474864 python3[1760]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:13:32 np0005474864 python3[1818]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Oct  7 15:13:34 np0005474864 python3[1850]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:13:34 np0005474864 python3[1874]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:13:34 np0005474864 python3[1898]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:13:34 np0005474864 python3[1922]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:13:35 np0005474864 python3[1946]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:13:35 np0005474864 python3[1970]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:13:37 np0005474864 python3[1996]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:13:37 np0005474864 python3[2074]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  7 15:13:38 np0005474864 python3[2147]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759864417.5284312-34-126827173278420/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:13:39 np0005474864 python3[2195]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:39 np0005474864 python3[2219]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:39 np0005474864 python3[2243]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:39 np0005474864 python3[2267]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:40 np0005474864 python3[2291]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:40 np0005474864 python3[2315]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:40 np0005474864 python3[2339]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:41 np0005474864 python3[2363]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:41 np0005474864 python3[2387]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:41 np0005474864 python3[2411]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:41 np0005474864 python3[2435]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:42 np0005474864 python3[2459]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:42 np0005474864 python3[2483]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:42 np0005474864 python3[2507]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:43 np0005474864 python3[2531]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:43 np0005474864 python3[2555]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:43 np0005474864 python3[2579]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:43 np0005474864 python3[2603]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:44 np0005474864 python3[2627]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:44 np0005474864 python3[2651]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:44 np0005474864 python3[2675]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:45 np0005474864 python3[2699]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:45 np0005474864 python3[2723]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:45 np0005474864 python3[2747]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:45 np0005474864 python3[2771]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:46 np0005474864 python3[2795]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:13:48 np0005474864 python3[2821]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct  7 15:13:48 np0005474864 systemd[1]: Starting Time & Date Service...
Oct  7 15:13:48 np0005474864 systemd[1]: Started Time & Date Service.
Oct  7 15:13:49 np0005474864 systemd-timedated[2823]: Changed time zone to 'UTC' (UTC).
Oct  7 15:13:49 np0005474864 python3[2853]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:13:49 np0005474864 python3[2929]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  7 15:13:50 np0005474864 python3[3000]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1759864429.6642962-253-139845909668071/source _original_basename=tmpq82z334g follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:13:50 np0005474864 python3[3100]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  7 15:13:51 np0005474864 python3[3171]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759864430.574317-304-260285705735806/source _original_basename=tmpzbzqta9f follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:13:52 np0005474864 python3[3273]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  7 15:13:52 np0005474864 python3[3346]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1759864431.876299-384-178001480616654/source _original_basename=tmpm28z0wmv follow=False checksum=ee9b126fe33e72500e994fd3e8d9deaa54707872 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:13:53 np0005474864 python3[3394]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:13:53 np0005474864 python3[3420]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:13:53 np0005474864 python3[3500]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  7 15:13:54 np0005474864 python3[3573]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1759864433.7071626-454-268588440501530/source _original_basename=tmpqkpmp4tt follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:13:55 np0005474864 python3[3624]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-f3da-5619-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:13:55 np0005474864 python3[3652]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-f3da-5619-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Oct  7 15:13:57 np0005474864 python3[3681]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:14:00 np0005474864 irqbalance[796]: Cannot change IRQ 26 affinity: Operation not permitted
Oct  7 15:14:00 np0005474864 irqbalance[796]: IRQ 26 affinity is now unmanaged
Oct  7 15:14:16 np0005474864 python3[3709]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:14:19 np0005474864 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  7 15:15:16 np0005474864 systemd-logind[805]: Session 1 logged out. Waiting for processes to exit.
Oct  7 15:15:37 np0005474864 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Oct  7 15:15:37 np0005474864 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Oct  7 15:15:37 np0005474864 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Oct  7 15:15:37 np0005474864 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Oct  7 15:15:37 np0005474864 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Oct  7 15:15:37 np0005474864 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Oct  7 15:15:37 np0005474864 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Oct  7 15:15:37 np0005474864 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Oct  7 15:15:37 np0005474864 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Oct  7 15:15:37 np0005474864 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Oct  7 15:15:37 np0005474864 NetworkManager[862]: <info>  [1759864537.9481] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  7 15:15:37 np0005474864 systemd-udevd[3712]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 15:15:37 np0005474864 NetworkManager[862]: <info>  [1759864537.9659] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  7 15:15:37 np0005474864 NetworkManager[862]: <info>  [1759864537.9682] settings: (eth1): created default wired connection 'Wired connection 1'
Oct  7 15:15:37 np0005474864 NetworkManager[862]: <info>  [1759864537.9685] device (eth1): carrier: link connected
Oct  7 15:15:37 np0005474864 NetworkManager[862]: <info>  [1759864537.9687] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Oct  7 15:15:37 np0005474864 NetworkManager[862]: <info>  [1759864537.9692] policy: auto-activating connection 'Wired connection 1' (9410294d-aa1e-3bb4-8bdc-9291d424595c)
Oct  7 15:15:37 np0005474864 NetworkManager[862]: <info>  [1759864537.9695] device (eth1): Activation: starting connection 'Wired connection 1' (9410294d-aa1e-3bb4-8bdc-9291d424595c)
Oct  7 15:15:37 np0005474864 NetworkManager[862]: <info>  [1759864537.9696] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  7 15:15:37 np0005474864 NetworkManager[862]: <info>  [1759864537.9698] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  7 15:15:37 np0005474864 NetworkManager[862]: <info>  [1759864537.9702] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  7 15:15:37 np0005474864 NetworkManager[862]: <info>  [1759864537.9706] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  7 15:15:37 np0005474864 systemd[1064]: Starting Mark boot as successful...
Oct  7 15:15:37 np0005474864 systemd[1064]: Finished Mark boot as successful.
Oct  7 15:15:38 np0005474864 systemd-logind[805]: New session 3 of user zuul.
Oct  7 15:15:38 np0005474864 systemd[1]: Started Session 3 of User zuul.
Oct  7 15:15:39 np0005474864 python3[3744]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-e018-61a1-0000000001ea-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:15:46 np0005474864 python3[3824]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  7 15:15:46 np0005474864 python3[3897]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759864545.7743144-206-227519283042828/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=25f3748474087b046ee05bfca193ae399e9d9261 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:15:46 np0005474864 python3[3947]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 15:15:47 np0005474864 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct  7 15:15:47 np0005474864 systemd[1]: Stopped Network Manager Wait Online.
Oct  7 15:15:47 np0005474864 systemd[1]: Stopping Network Manager Wait Online...
Oct  7 15:15:47 np0005474864 systemd[1]: Stopping Network Manager...
Oct  7 15:15:47 np0005474864 NetworkManager[862]: <info>  [1759864547.0099] caught SIGTERM, shutting down normally.
Oct  7 15:15:47 np0005474864 NetworkManager[862]: <info>  [1759864547.0113] dhcp4 (eth0): canceled DHCP transaction
Oct  7 15:15:47 np0005474864 NetworkManager[862]: <info>  [1759864547.0114] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  7 15:15:47 np0005474864 NetworkManager[862]: <info>  [1759864547.0114] dhcp4 (eth0): state changed no lease
Oct  7 15:15:47 np0005474864 NetworkManager[862]: <info>  [1759864547.0117] manager: NetworkManager state is now CONNECTING
Oct  7 15:15:47 np0005474864 NetworkManager[862]: <info>  [1759864547.0253] dhcp4 (eth1): canceled DHCP transaction
Oct  7 15:15:47 np0005474864 NetworkManager[862]: <info>  [1759864547.0253] dhcp4 (eth1): state changed no lease
Oct  7 15:15:47 np0005474864 NetworkManager[862]: <info>  [1759864547.0290] exiting (success)
Oct  7 15:15:47 np0005474864 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  7 15:15:47 np0005474864 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  7 15:15:47 np0005474864 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct  7 15:15:47 np0005474864 systemd[1]: Stopped Network Manager.
Oct  7 15:15:47 np0005474864 systemd[1]: NetworkManager.service: Consumed 1.081s CPU time, 10.0M memory peak.
Oct  7 15:15:47 np0005474864 systemd[1]: Starting Network Manager...
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.0989] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:6a778c83-97cd-4db9-828a-f91a822d201d)
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.0994] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1065] manager[0x55f9fb494070]: monitoring kernel firmware directory '/lib/firmware'.
Oct  7 15:15:47 np0005474864 systemd[1]: Starting Hostname Service...
Oct  7 15:15:47 np0005474864 systemd[1]: Started Hostname Service.
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1780] hostname: hostname: using hostnamed
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1781] hostname: static hostname changed from (none) to "np0005474864.novalocal"
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1791] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1799] manager[0x55f9fb494070]: rfkill: Wi-Fi hardware radio set enabled
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1800] manager[0x55f9fb494070]: rfkill: WWAN hardware radio set enabled
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1849] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1850] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1851] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1852] manager: Networking is enabled by state file
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1856] settings: Loaded settings plugin: keyfile (internal)
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1863] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1907] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1925] dhcp: init: Using DHCP client 'internal'
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1929] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1938] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1948] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1962] device (lo): Activation: starting connection 'lo' (7186dd46-7bf7-4d5a-893f-437c9f730689)
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1973] device (eth0): carrier: link connected
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1980] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1990] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.1990] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2002] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2015] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2028] device (eth1): carrier: link connected
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2035] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2045] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (9410294d-aa1e-3bb4-8bdc-9291d424595c) (indicated)
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2045] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2055] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2067] device (eth1): Activation: starting connection 'Wired connection 1' (9410294d-aa1e-3bb4-8bdc-9291d424595c)
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2075] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  7 15:15:47 np0005474864 systemd[1]: Started Network Manager.
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2083] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2088] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2092] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2095] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2102] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2109] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2116] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2122] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2137] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2141] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2160] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2166] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2192] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  7 15:15:47 np0005474864 systemd[1]: Starting Network Manager Wait Online...
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2200] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2209] device (lo): Activation: successful, device activated.
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2221] dhcp4 (eth0): state changed new lease, address=38.102.83.243
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2232] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2333] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2349] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2353] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2358] manager: NetworkManager state is now CONNECTED_SITE
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2365] device (eth0): Activation: successful, device activated.
Oct  7 15:15:47 np0005474864 NetworkManager[3955]: <info>  [1759864547.2373] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  7 15:15:47 np0005474864 python3[4031]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-e018-61a1-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:15:57 np0005474864 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  7 15:16:17 np0005474864 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <info>  [1759864592.0510] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  7 15:16:32 np0005474864 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  7 15:16:32 np0005474864 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <info>  [1759864592.0855] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <info>  [1759864592.0858] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <info>  [1759864592.0865] device (eth1): Activation: successful, device activated.
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <info>  [1759864592.0871] manager: startup complete
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <info>  [1759864592.0873] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <warn>  [1759864592.0877] device (eth1): Activation: failed for connection 'Wired connection 1'
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <info>  [1759864592.0883] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Oct  7 15:16:32 np0005474864 systemd[1]: Finished Network Manager Wait Online.
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <info>  [1759864592.1043] dhcp4 (eth1): canceled DHCP transaction
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <info>  [1759864592.1044] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <info>  [1759864592.1044] dhcp4 (eth1): state changed no lease
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <info>  [1759864592.1067] policy: auto-activating connection 'ci-private-network' (ec84c6d0-c5d8-5fc3-a61d-f32b4823d873)
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <info>  [1759864592.1074] device (eth1): Activation: starting connection 'ci-private-network' (ec84c6d0-c5d8-5fc3-a61d-f32b4823d873)
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <info>  [1759864592.1076] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <info>  [1759864592.1081] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <info>  [1759864592.1093] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <info>  [1759864592.1106] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <info>  [1759864592.1196] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <info>  [1759864592.1197] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  7 15:16:32 np0005474864 NetworkManager[3955]: <info>  [1759864592.1204] device (eth1): Activation: successful, device activated.
Oct  7 15:16:42 np0005474864 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  7 15:16:47 np0005474864 systemd[1]: session-3.scope: Deactivated successfully.
Oct  7 15:16:47 np0005474864 systemd[1]: session-3.scope: Consumed 1.618s CPU time.
Oct  7 15:16:47 np0005474864 systemd-logind[805]: Session 3 logged out. Waiting for processes to exit.
Oct  7 15:16:47 np0005474864 systemd-logind[805]: Removed session 3.
Oct  7 15:16:57 np0005474864 systemd-logind[805]: New session 4 of user zuul.
Oct  7 15:16:57 np0005474864 systemd[1]: Started Session 4 of User zuul.
Oct  7 15:16:58 np0005474864 python3[4140]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  7 15:16:58 np0005474864 python3[4213]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759864617.9103968-365-267932168007064/source _original_basename=tmp2jrdlxap follow=False checksum=fdd1140b691586047f68ed609442efe2a7006f8d backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:17:00 np0005474864 systemd[1]: session-4.scope: Deactivated successfully.
Oct  7 15:17:00 np0005474864 systemd-logind[805]: Session 4 logged out. Waiting for processes to exit.
Oct  7 15:17:00 np0005474864 systemd-logind[805]: Removed session 4.
Oct  7 15:18:39 np0005474864 systemd[1064]: Created slice User Background Tasks Slice.
Oct  7 15:18:39 np0005474864 systemd[1064]: Starting Cleanup of User's Temporary Files and Directories...
Oct  7 15:18:39 np0005474864 systemd[1064]: Finished Cleanup of User's Temporary Files and Directories.
Oct  7 15:22:19 np0005474864 systemd-logind[805]: New session 5 of user zuul.
Oct  7 15:22:19 np0005474864 systemd[1]: Started Session 5 of User zuul.
Oct  7 15:22:19 np0005474864 python3[4282]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-0dec-a54f-000000001cf8-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:22:20 np0005474864 python3[4310]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:22:20 np0005474864 python3[4336]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:22:20 np0005474864 python3[4363]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:22:20 np0005474864 python3[4389]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:22:21 np0005474864 python3[4415]: ansible-ansible.builtin.lineinfile Invoked with path=/etc/systemd/system.conf regexp=^#DefaultIOAccounting=no line=DefaultIOAccounting=yes state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:22:21 np0005474864 python3[4415]: ansible-ansible.builtin.lineinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Oct  7 15:22:22 np0005474864 python3[4441]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  7 15:22:22 np0005474864 systemd[1]: Reloading.
Oct  7 15:22:22 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:22:23 np0005474864 python3[4497]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Oct  7 15:22:24 np0005474864 python3[4523]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:22:24 np0005474864 python3[4551]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:22:25 np0005474864 python3[4579]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:22:25 np0005474864 python3[4607]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:22:25 np0005474864 python3[4634]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-0dec-a54f-000000001cfe-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:22:26 np0005474864 python3[4664]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:22:29 np0005474864 systemd[1]: session-5.scope: Deactivated successfully.
Oct  7 15:22:29 np0005474864 systemd[1]: session-5.scope: Consumed 3.330s CPU time.
Oct  7 15:22:29 np0005474864 systemd-logind[805]: Session 5 logged out. Waiting for processes to exit.
Oct  7 15:22:29 np0005474864 systemd-logind[805]: Removed session 5.
Oct  7 15:22:30 np0005474864 systemd-logind[805]: New session 6 of user zuul.
Oct  7 15:22:30 np0005474864 systemd[1]: Started Session 6 of User zuul.
Oct  7 15:22:31 np0005474864 python3[4697]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Oct  7 15:22:46 np0005474864 kernel: SELinux:  Converting 363 SID table entries...
Oct  7 15:22:46 np0005474864 kernel: SELinux:  policy capability network_peer_controls=1
Oct  7 15:22:46 np0005474864 kernel: SELinux:  policy capability open_perms=1
Oct  7 15:22:46 np0005474864 kernel: SELinux:  policy capability extended_socket_class=1
Oct  7 15:22:46 np0005474864 kernel: SELinux:  policy capability always_check_network=0
Oct  7 15:22:46 np0005474864 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  7 15:22:46 np0005474864 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  7 15:22:46 np0005474864 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  7 15:22:55 np0005474864 kernel: SELinux:  Converting 363 SID table entries...
Oct  7 15:22:55 np0005474864 kernel: SELinux:  policy capability network_peer_controls=1
Oct  7 15:22:55 np0005474864 kernel: SELinux:  policy capability open_perms=1
Oct  7 15:22:55 np0005474864 kernel: SELinux:  policy capability extended_socket_class=1
Oct  7 15:22:55 np0005474864 kernel: SELinux:  policy capability always_check_network=0
Oct  7 15:22:55 np0005474864 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  7 15:22:55 np0005474864 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  7 15:22:55 np0005474864 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  7 15:23:04 np0005474864 kernel: SELinux:  Converting 363 SID table entries...
Oct  7 15:23:04 np0005474864 kernel: SELinux:  policy capability network_peer_controls=1
Oct  7 15:23:04 np0005474864 kernel: SELinux:  policy capability open_perms=1
Oct  7 15:23:04 np0005474864 kernel: SELinux:  policy capability extended_socket_class=1
Oct  7 15:23:04 np0005474864 kernel: SELinux:  policy capability always_check_network=0
Oct  7 15:23:04 np0005474864 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  7 15:23:04 np0005474864 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  7 15:23:04 np0005474864 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  7 15:23:05 np0005474864 setsebool[4762]: The virt_use_nfs policy boolean was changed to 1 by root
Oct  7 15:23:05 np0005474864 setsebool[4762]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Oct  7 15:23:16 np0005474864 kernel: SELinux:  Converting 366 SID table entries...
Oct  7 15:23:16 np0005474864 kernel: SELinux:  policy capability network_peer_controls=1
Oct  7 15:23:16 np0005474864 kernel: SELinux:  policy capability open_perms=1
Oct  7 15:23:16 np0005474864 kernel: SELinux:  policy capability extended_socket_class=1
Oct  7 15:23:16 np0005474864 kernel: SELinux:  policy capability always_check_network=0
Oct  7 15:23:16 np0005474864 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  7 15:23:16 np0005474864 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  7 15:23:16 np0005474864 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  7 15:23:34 np0005474864 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct  7 15:23:34 np0005474864 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  7 15:23:34 np0005474864 systemd[1]: Starting man-db-cache-update.service...
Oct  7 15:23:34 np0005474864 systemd[1]: Reloading.
Oct  7 15:23:34 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:23:34 np0005474864 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  7 15:23:35 np0005474864 systemd[1]: Starting PackageKit Daemon...
Oct  7 15:23:36 np0005474864 systemd[1]: Starting Authorization Manager...
Oct  7 15:23:36 np0005474864 polkitd[6437]: Started polkitd version 0.117
Oct  7 15:23:36 np0005474864 systemd[1]: Started Authorization Manager.
Oct  7 15:23:36 np0005474864 systemd[1]: Started PackageKit Daemon.
Oct  7 15:23:40 np0005474864 irqbalance[796]: Cannot change IRQ 27 affinity: Operation not permitted
Oct  7 15:23:40 np0005474864 irqbalance[796]: IRQ 27 affinity is now unmanaged
Oct  7 15:23:40 np0005474864 python3[9923]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-d363-7fa2-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:23:41 np0005474864 kernel: evm: overlay not supported
Oct  7 15:23:41 np0005474864 systemd[1064]: Starting D-Bus User Message Bus...
Oct  7 15:23:41 np0005474864 dbus-broker-launch[10518]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Oct  7 15:23:41 np0005474864 dbus-broker-launch[10518]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Oct  7 15:23:41 np0005474864 systemd[1064]: Started D-Bus User Message Bus.
Oct  7 15:23:41 np0005474864 dbus-broker-lau[10518]: Ready
Oct  7 15:23:41 np0005474864 systemd[1064]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Oct  7 15:23:41 np0005474864 systemd[1064]: Created slice Slice /user.
Oct  7 15:23:41 np0005474864 systemd[1064]: podman-10499.scope: unit configures an IP firewall, but not running as root.
Oct  7 15:23:41 np0005474864 systemd[1064]: (This warning is only shown for the first unit using IP firewalling.)
Oct  7 15:23:41 np0005474864 systemd[1064]: Started podman-10499.scope.
Oct  7 15:23:41 np0005474864 systemd[1064]: Started podman-pause-311ec862.scope.
Oct  7 15:23:42 np0005474864 python3[10724]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.110:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.110:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:23:42 np0005474864 systemd-logind[805]: Session 6 logged out. Waiting for processes to exit.
Oct  7 15:23:42 np0005474864 systemd[1]: session-6.scope: Deactivated successfully.
Oct  7 15:23:42 np0005474864 systemd[1]: session-6.scope: Consumed 58.588s CPU time.
Oct  7 15:23:42 np0005474864 systemd-logind[805]: Removed session 6.
Oct  7 15:24:05 np0005474864 systemd-logind[805]: New session 7 of user zuul.
Oct  7 15:24:05 np0005474864 systemd[1]: Started Session 7 of User zuul.
Oct  7 15:24:06 np0005474864 python3[20095]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMPyBpL+9rfsf6+cDdS8+kSc4D8yMXezgBsDZgRMtoa+gMKpIcOyv0QSAhLcPFVHGPWf6jc7/7uKyXxv6hoUAT0= zuul@np0005474861.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:24:06 np0005474864 python3[20339]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMPyBpL+9rfsf6+cDdS8+kSc4D8yMXezgBsDZgRMtoa+gMKpIcOyv0QSAhLcPFVHGPWf6jc7/7uKyXxv6hoUAT0= zuul@np0005474861.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:24:07 np0005474864 python3[20726]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005474864.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Oct  7 15:24:08 np0005474864 python3[20931]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMPyBpL+9rfsf6+cDdS8+kSc4D8yMXezgBsDZgRMtoa+gMKpIcOyv0QSAhLcPFVHGPWf6jc7/7uKyXxv6hoUAT0= zuul@np0005474861.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Oct  7 15:24:08 np0005474864 python3[21192]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  7 15:24:09 np0005474864 python3[21439]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759865048.3956187-170-210210704810511/source _original_basename=tmpn6ktxaqw follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:24:10 np0005474864 python3[21738]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Oct  7 15:24:10 np0005474864 systemd[1]: Starting Hostname Service...
Oct  7 15:24:10 np0005474864 systemd[1]: Started Hostname Service.
Oct  7 15:24:10 np0005474864 systemd-hostnamed[21842]: Changed pretty hostname to 'compute-2'
Oct  7 15:24:10 np0005474864 systemd-hostnamed[21842]: Hostname set to <compute-2> (static)
Oct  7 15:24:10 np0005474864 NetworkManager[3955]: <info>  [1759865050.1630] hostname: static hostname changed from "np0005474864.novalocal" to "compute-2"
Oct  7 15:24:10 np0005474864 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  7 15:24:10 np0005474864 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  7 15:24:10 np0005474864 systemd[1]: session-7.scope: Deactivated successfully.
Oct  7 15:24:10 np0005474864 systemd[1]: session-7.scope: Consumed 2.213s CPU time.
Oct  7 15:24:10 np0005474864 systemd-logind[805]: Session 7 logged out. Waiting for processes to exit.
Oct  7 15:24:10 np0005474864 systemd-logind[805]: Removed session 7.
Oct  7 15:24:20 np0005474864 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  7 15:24:22 np0005474864 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  7 15:24:22 np0005474864 systemd[1]: Finished man-db-cache-update.service.
Oct  7 15:24:22 np0005474864 systemd[1]: man-db-cache-update.service: Consumed 55.776s CPU time.
Oct  7 15:24:22 np0005474864 systemd[1]: run-raffe448566564d4295c4f8046c814741.service: Deactivated successfully.
Oct  7 15:24:40 np0005474864 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  7 15:27:36 np0005474864 systemd-logind[805]: New session 8 of user zuul.
Oct  7 15:27:36 np0005474864 systemd[1]: Started Session 8 of User zuul.
Oct  7 15:27:37 np0005474864 python3[26650]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:27:39 np0005474864 python3[26766]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  7 15:27:39 np0005474864 python3[26839]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759865258.9023519-30586-235080922510188/source mode=0755 _original_basename=delorean.repo follow=False checksum=c02c26d38f431b15f6463fc53c3d93ed5138ff07 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:27:39 np0005474864 python3[26865]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  7 15:27:40 np0005474864 python3[26938]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759865258.9023519-30586-235080922510188/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=0bdbb813b840548359ae77c28d76ca272ccaf31b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:27:40 np0005474864 python3[26964]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  7 15:27:40 np0005474864 python3[27037]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759865258.9023519-30586-235080922510188/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:27:41 np0005474864 python3[27063]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  7 15:27:41 np0005474864 python3[27136]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759865258.9023519-30586-235080922510188/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:27:41 np0005474864 python3[27162]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  7 15:27:42 np0005474864 python3[27235]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759865258.9023519-30586-235080922510188/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:27:42 np0005474864 python3[27261]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  7 15:27:42 np0005474864 python3[27334]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759865258.9023519-30586-235080922510188/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:27:42 np0005474864 python3[27360]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Oct  7 15:27:43 np0005474864 python3[27433]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1759865258.9023519-30586-235080922510188/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=75ca8f9fe9a538824fd094f239c30e8ce8652e8a backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:27:55 np0005474864 python3[27481]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:27:55 np0005474864 systemd[1]: Starting Cleanup of Temporary Directories...
Oct  7 15:27:55 np0005474864 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Oct  7 15:27:55 np0005474864 systemd[1]: Finished Cleanup of Temporary Directories.
Oct  7 15:27:55 np0005474864 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Oct  7 15:28:41 np0005474864 systemd[1]: packagekit.service: Deactivated successfully.
Oct  7 15:32:54 np0005474864 systemd[1]: session-8.scope: Deactivated successfully.
Oct  7 15:32:54 np0005474864 systemd[1]: session-8.scope: Consumed 5.000s CPU time.
Oct  7 15:32:54 np0005474864 systemd-logind[805]: Session 8 logged out. Waiting for processes to exit.
Oct  7 15:32:54 np0005474864 systemd-logind[805]: Removed session 8.
Oct  7 15:39:50 np0005474864 systemd-logind[805]: New session 9 of user zuul.
Oct  7 15:39:50 np0005474864 systemd[1]: Started Session 9 of User zuul.
Oct  7 15:39:51 np0005474864 python3.9[27655]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:39:52 np0005474864 python3.9[27836]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:40:00 np0005474864 systemd[1]: session-9.scope: Deactivated successfully.
Oct  7 15:40:00 np0005474864 systemd[1]: session-9.scope: Consumed 8.159s CPU time.
Oct  7 15:40:00 np0005474864 systemd-logind[805]: Session 9 logged out. Waiting for processes to exit.
Oct  7 15:40:00 np0005474864 systemd-logind[805]: Removed session 9.
Oct  7 15:40:05 np0005474864 systemd-logind[805]: New session 10 of user zuul.
Oct  7 15:40:05 np0005474864 systemd[1]: Started Session 10 of User zuul.
Oct  7 15:40:06 np0005474864 python3.9[28048]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:40:06 np0005474864 systemd[1]: session-10.scope: Deactivated successfully.
Oct  7 15:40:06 np0005474864 systemd-logind[805]: Session 10 logged out. Waiting for processes to exit.
Oct  7 15:40:06 np0005474864 systemd-logind[805]: Removed session 10.
Oct  7 15:40:22 np0005474864 systemd-logind[805]: New session 11 of user zuul.
Oct  7 15:40:22 np0005474864 systemd[1]: Started Session 11 of User zuul.
Oct  7 15:40:23 np0005474864 python3.9[28229]: ansible-ansible.legacy.ping Invoked with data=pong
Oct  7 15:40:24 np0005474864 python3.9[28403]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:40:25 np0005474864 python3.9[28555]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:40:26 np0005474864 python3.9[28708]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:40:28 np0005474864 python3.9[28860]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:40:28 np0005474864 python3.9[29012]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:40:29 np0005474864 python3.9[29135]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759866028.39091-180-268761372692793/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:40:30 np0005474864 python3.9[29287]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:40:31 np0005474864 python3.9[29443]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:40:32 np0005474864 python3.9[29593]: ansible-ansible.builtin.service_facts Invoked
Oct  7 15:40:38 np0005474864 python3.9[29848]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:40:39 np0005474864 python3.9[29998]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:40:40 np0005474864 python3.9[30152]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:40:42 np0005474864 python3.9[30310]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  7 15:40:43 np0005474864 python3.9[30394]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  7 15:41:31 np0005474864 systemd[1]: Reloading.
Oct  7 15:41:31 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:41:31 np0005474864 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Oct  7 15:41:32 np0005474864 systemd[1]: Reloading.
Oct  7 15:41:32 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:41:32 np0005474864 systemd[1]: Starting dnf makecache...
Oct  7 15:41:32 np0005474864 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Oct  7 15:41:32 np0005474864 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Oct  7 15:41:32 np0005474864 systemd[1]: Reloading.
Oct  7 15:41:32 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:41:32 np0005474864 dnf[30632]: Failed determining last makecache time.
Oct  7 15:41:32 np0005474864 dnf[30632]: delorean-openstack-barbican-42b4c41831408a8e323 163 kB/s | 3.0 kB     00:00
Oct  7 15:41:32 np0005474864 dnf[30632]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 171 kB/s | 3.0 kB     00:00
Oct  7 15:41:32 np0005474864 dnf[30632]: delorean-openstack-cinder-1c00d6490d88e436f26ef 202 kB/s | 3.0 kB     00:00
Oct  7 15:41:32 np0005474864 systemd[1]: Listening on LVM2 poll daemon socket.
Oct  7 15:41:32 np0005474864 dnf[30632]: delorean-python-stevedore-c4acc5639fd2329372142 200 kB/s | 3.0 kB     00:00
Oct  7 15:41:32 np0005474864 dnf[30632]: delorean-python-cloudkitty-tests-tempest-3961dc 201 kB/s | 3.0 kB     00:00
Oct  7 15:41:32 np0005474864 dnf[30632]: delorean-diskimage-builder-43381184423c185801b5 209 kB/s | 3.0 kB     00:00
Oct  7 15:41:32 np0005474864 dnf[30632]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 190 kB/s | 3.0 kB     00:00
Oct  7 15:41:32 np0005474864 dnf[30632]: delorean-python-designate-tests-tempest-347fdbc 184 kB/s | 3.0 kB     00:00
Oct  7 15:41:32 np0005474864 dnf[30632]: delorean-openstack-glance-1fd12c29b339f30fe823e 182 kB/s | 3.0 kB     00:00
Oct  7 15:41:32 np0005474864 dnf[30632]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 178 kB/s | 3.0 kB     00:00
Oct  7 15:41:32 np0005474864 dnf[30632]: delorean-openstack-manila-3c01b7181572c95dac462 196 kB/s | 3.0 kB     00:00
Oct  7 15:41:32 np0005474864 dnf[30632]: delorean-python-vmware-nsxlib-458234972d1428ac9 179 kB/s | 3.0 kB     00:00
Oct  7 15:41:32 np0005474864 dnf[30632]: delorean-openstack-octavia-ba397f07a7331190208c 162 kB/s | 3.0 kB     00:00
Oct  7 15:41:32 np0005474864 dnf[30632]: delorean-openstack-watcher-c014f81a8647287f6dcc 179 kB/s | 3.0 kB     00:00
Oct  7 15:41:32 np0005474864 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Oct  7 15:41:32 np0005474864 dnf[30632]: delorean-edpm-image-builder-55ba53cf215b14ed95b 187 kB/s | 3.0 kB     00:00
Oct  7 15:41:32 np0005474864 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Oct  7 15:41:32 np0005474864 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Oct  7 15:41:32 np0005474864 dnf[30632]: delorean-puppet-ceph-b0c245ccde541a63fde0564366 187 kB/s | 3.0 kB     00:00
Oct  7 15:41:32 np0005474864 dnf[30632]: delorean-openstack-swift-dc98a8463506ac520c469a 193 kB/s | 3.0 kB     00:00
Oct  7 15:41:32 np0005474864 dnf[30632]: delorean-python-tempestconf-8515371b7cceebd4282 186 kB/s | 3.0 kB     00:00
Oct  7 15:41:33 np0005474864 dnf[30632]: delorean-openstack-heat-ui-013accbfd179753bc3f0 194 kB/s | 3.0 kB     00:00
Oct  7 15:41:33 np0005474864 dnf[30632]: CentOS Stream 9 - BaseOS                         71 kB/s | 6.7 kB     00:00
Oct  7 15:41:33 np0005474864 dnf[30632]: CentOS Stream 9 - AppStream                      69 kB/s | 6.8 kB     00:00
Oct  7 15:41:33 np0005474864 dnf[30632]: CentOS Stream 9 - CRB                            78 kB/s | 6.6 kB     00:00
Oct  7 15:41:33 np0005474864 dnf[30632]: CentOS Stream 9 - Extras packages                29 kB/s | 8.0 kB     00:00
Oct  7 15:41:33 np0005474864 dnf[30632]: dlrn-antelope-testing                           200 kB/s | 3.0 kB     00:00
Oct  7 15:41:33 np0005474864 dnf[30632]: dlrn-antelope-build-deps                        200 kB/s | 3.0 kB     00:00
Oct  7 15:41:33 np0005474864 dnf[30632]: centos9-rabbitmq                                 93 kB/s | 3.0 kB     00:00
Oct  7 15:41:33 np0005474864 dnf[30632]: centos9-storage                                  43 kB/s | 3.0 kB     00:00
Oct  7 15:41:33 np0005474864 dnf[30632]: centos9-opstools                                 81 kB/s | 3.0 kB     00:00
Oct  7 15:41:33 np0005474864 dnf[30632]: NFV SIG OpenvSwitch                              75 kB/s | 3.0 kB     00:00
Oct  7 15:41:34 np0005474864 dnf[30632]: repo-setup-centos-appstream                     110 kB/s | 4.4 kB     00:00
Oct  7 15:41:34 np0005474864 dnf[30632]: repo-setup-centos-baseos                         88 kB/s | 3.9 kB     00:00
Oct  7 15:41:34 np0005474864 dnf[30632]: repo-setup-centos-highavailability              106 kB/s | 3.9 kB     00:00
Oct  7 15:41:34 np0005474864 dnf[30632]: repo-setup-centos-powertools                    143 kB/s | 4.3 kB     00:00
Oct  7 15:41:34 np0005474864 dnf[30632]: Extra Packages for Enterprise Linux 9 - x86_64  275 kB/s |  34 kB     00:00
Oct  7 15:41:34 np0005474864 dnf[30632]: Metadata cache created.
Oct  7 15:41:35 np0005474864 systemd[1]: dnf-makecache.service: Deactivated successfully.
Oct  7 15:41:35 np0005474864 systemd[1]: Finished dnf makecache.
Oct  7 15:41:35 np0005474864 systemd[1]: dnf-makecache.service: Consumed 1.642s CPU time.
Oct  7 15:42:35 np0005474864 kernel: SELinux:  Converting 2714 SID table entries...
Oct  7 15:42:35 np0005474864 kernel: SELinux:  policy capability network_peer_controls=1
Oct  7 15:42:35 np0005474864 kernel: SELinux:  policy capability open_perms=1
Oct  7 15:42:35 np0005474864 kernel: SELinux:  policy capability extended_socket_class=1
Oct  7 15:42:35 np0005474864 kernel: SELinux:  policy capability always_check_network=0
Oct  7 15:42:35 np0005474864 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  7 15:42:35 np0005474864 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  7 15:42:35 np0005474864 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  7 15:42:36 np0005474864 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Oct  7 15:42:36 np0005474864 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  7 15:42:36 np0005474864 systemd[1]: Starting man-db-cache-update.service...
Oct  7 15:42:36 np0005474864 systemd[1]: Reloading.
Oct  7 15:42:36 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:42:36 np0005474864 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  7 15:42:37 np0005474864 systemd[1]: Starting PackageKit Daemon...
Oct  7 15:42:37 np0005474864 systemd[1]: Started PackageKit Daemon.
Oct  7 15:42:37 np0005474864 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  7 15:42:37 np0005474864 systemd[1]: Finished man-db-cache-update.service.
Oct  7 15:42:37 np0005474864 systemd[1]: man-db-cache-update.service: Consumed 1.268s CPU time.
Oct  7 15:42:37 np0005474864 systemd[1]: run-r88cf4fddb0a746a983b50b8d9b0777fd.service: Deactivated successfully.
Oct  7 15:42:38 np0005474864 python3.9[31950]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:42:40 np0005474864 python3.9[32231]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Oct  7 15:42:41 np0005474864 python3.9[32383]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Oct  7 15:42:47 np0005474864 python3.9[32536]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:42:57 np0005474864 python3.9[32688]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Oct  7 15:42:59 np0005474864 python3.9[32840]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:43:00 np0005474864 python3.9[32992]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:43:00 np0005474864 python3.9[33115]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866179.5799394-641-231795085000401/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=13dbff74fbaeb9060262c6a672f4253d8d1d5def backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:43:02 np0005474864 python3.9[33267]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Oct  7 15:43:03 np0005474864 python3.9[33420]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  7 15:43:03 np0005474864 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  7 15:43:04 np0005474864 python3.9[33579]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  7 15:43:05 np0005474864 python3.9[33739]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Oct  7 15:43:06 np0005474864 python3.9[33892]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  7 15:43:07 np0005474864 python3.9[34050]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Oct  7 15:43:08 np0005474864 python3.9[34202]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  7 15:43:11 np0005474864 python3.9[34355]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:43:11 np0005474864 python3.9[34507]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:43:12 np0005474864 python3.9[34630]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759866191.3354752-928-116157403520700/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:43:13 np0005474864 python3.9[34782]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 15:43:13 np0005474864 systemd[1]: Starting Load Kernel Modules...
Oct  7 15:43:13 np0005474864 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Oct  7 15:43:13 np0005474864 kernel: Bridge firewalling registered
Oct  7 15:43:13 np0005474864 systemd-modules-load[34786]: Inserted module 'br_netfilter'
Oct  7 15:43:13 np0005474864 systemd[1]: Finished Load Kernel Modules.
Oct  7 15:43:14 np0005474864 python3.9[34943]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:43:15 np0005474864 python3.9[35066]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759866194.27814-996-152939466891779/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:43:16 np0005474864 python3.9[35218]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  7 15:43:19 np0005474864 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Oct  7 15:43:19 np0005474864 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Oct  7 15:43:20 np0005474864 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  7 15:43:20 np0005474864 systemd[1]: Starting man-db-cache-update.service...
Oct  7 15:43:20 np0005474864 systemd[1]: Reloading.
Oct  7 15:43:20 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:43:20 np0005474864 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  7 15:43:22 np0005474864 python3.9[36747]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:43:23 np0005474864 python3.9[37706]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Oct  7 15:43:23 np0005474864 python3.9[38498]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:43:24 np0005474864 python3.9[39381]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:43:24 np0005474864 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  7 15:43:24 np0005474864 systemd[1]: Finished man-db-cache-update.service.
Oct  7 15:43:24 np0005474864 systemd[1]: man-db-cache-update.service: Consumed 5.737s CPU time.
Oct  7 15:43:24 np0005474864 systemd[1]: run-rd8391515bce944a0a03c7d2bc2dd49a9.service: Deactivated successfully.
Oct  7 15:43:24 np0005474864 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct  7 15:43:25 np0005474864 systemd[1]: Started Dynamic System Tuning Daemon.
Oct  7 15:43:26 np0005474864 python3.9[39755]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:43:26 np0005474864 systemd[1]: Stopping Dynamic System Tuning Daemon...
Oct  7 15:43:26 np0005474864 systemd[1]: tuned.service: Deactivated successfully.
Oct  7 15:43:26 np0005474864 systemd[1]: Stopped Dynamic System Tuning Daemon.
Oct  7 15:43:26 np0005474864 systemd[1]: Starting Dynamic System Tuning Daemon...
Oct  7 15:43:26 np0005474864 systemd[1]: Started Dynamic System Tuning Daemon.
Oct  7 15:43:27 np0005474864 python3.9[39916]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Oct  7 15:43:31 np0005474864 python3.9[40068]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:43:31 np0005474864 systemd[1]: Reloading.
Oct  7 15:43:31 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:43:32 np0005474864 python3.9[40257]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:43:32 np0005474864 systemd[1]: Reloading.
Oct  7 15:43:32 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:43:33 np0005474864 python3.9[40446]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:43:35 np0005474864 python3.9[40599]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:43:35 np0005474864 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Oct  7 15:43:36 np0005474864 python3.9[40752]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:43:38 np0005474864 python3.9[40914]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:43:39 np0005474864 python3.9[41067]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 15:43:39 np0005474864 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Oct  7 15:43:39 np0005474864 systemd[1]: Stopped Apply Kernel Variables.
Oct  7 15:43:39 np0005474864 systemd[1]: Stopping Apply Kernel Variables...
Oct  7 15:43:39 np0005474864 systemd[1]: Starting Apply Kernel Variables...
Oct  7 15:43:39 np0005474864 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Oct  7 15:43:39 np0005474864 systemd[1]: Finished Apply Kernel Variables.
Oct  7 15:43:39 np0005474864 systemd[1]: session-11.scope: Deactivated successfully.
Oct  7 15:43:39 np0005474864 systemd[1]: session-11.scope: Consumed 2min 14.105s CPU time.
Oct  7 15:43:39 np0005474864 systemd-logind[805]: Session 11 logged out. Waiting for processes to exit.
Oct  7 15:43:39 np0005474864 systemd-logind[805]: Removed session 11.
Oct  7 15:43:45 np0005474864 systemd-logind[805]: New session 12 of user zuul.
Oct  7 15:43:45 np0005474864 systemd[1]: Started Session 12 of User zuul.
Oct  7 15:43:46 np0005474864 python3.9[41251]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:43:47 np0005474864 python3.9[41405]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:43:48 np0005474864 python3.9[41561]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:43:49 np0005474864 python3.9[41712]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:43:50 np0005474864 python3.9[41868]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  7 15:43:51 np0005474864 python3.9[41952]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  7 15:43:53 np0005474864 python3.9[42105]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  7 15:43:55 np0005474864 python3.9[42276]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:43:56 np0005474864 python3.9[42428]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:43:56 np0005474864 systemd[1]: var-lib-containers-storage-overlay-compat3212994284-merged.mount: Deactivated successfully.
Oct  7 15:43:56 np0005474864 podman[42429]: 2025-10-07 19:43:56.156793632 +0000 UTC m=+0.064333177 system refresh
Oct  7 15:43:57 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:43:57 np0005474864 python3.9[42591]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:43:58 np0005474864 python3.9[42714]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866236.4382474-289-78519506561473/.source.json follow=False _original_basename=podman_network_config.j2 checksum=5e4aa567d7177d9ec29e7e7908108d53612c9757 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:43:58 np0005474864 python3.9[42866]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:43:59 np0005474864 python3.9[42989]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759866238.2754443-335-80464635594109/.source.conf follow=False _original_basename=registries.conf.j2 checksum=ab0610e0f472dc1e1d78a5bc4899a6884e6f2bfe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:44:00 np0005474864 python3.9[43141]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:44:01 np0005474864 python3.9[43293]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:44:02 np0005474864 python3.9[43445]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:44:02 np0005474864 python3.9[43597]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:44:04 np0005474864 python3.9[43747]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:44:04 np0005474864 python3.9[43901]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  7 15:44:06 np0005474864 python3.9[44054]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  7 15:44:09 np0005474864 python3.9[44214]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  7 15:44:11 np0005474864 python3.9[44367]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  7 15:44:14 np0005474864 python3.9[44520]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  7 15:44:16 np0005474864 python3.9[44676]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  7 15:44:20 np0005474864 python3.9[44844]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  7 15:44:22 np0005474864 python3.9[44997]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  7 15:44:42 np0005474864 python3.9[45334]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:44:43 np0005474864 python3.9[45509]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:44:44 np0005474864 python3.9[45632]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1759866282.9367337-751-139086747573879/.source.json _original_basename=.so5pj9xx follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:44:45 np0005474864 python3.9[45784]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  7 15:44:45 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:44:48 np0005474864 systemd[1]: var-lib-containers-storage-overlay-compat3678758011-lower\x2dmapped.mount: Deactivated successfully.
Oct  7 15:44:53 np0005474864 podman[45797]: 2025-10-07 19:44:53.561455186 +0000 UTC m=+7.924366214 image pull 74877095db294c27659f24e7f86074178a6f28eee68561c30e3ce4d18519e09c quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  7 15:44:53 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:44:53 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:44:53 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:44:54 np0005474864 python3.9[46099]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  7 15:44:54 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:44:57 np0005474864 podman[46112]: 2025-10-07 19:44:57.525180856 +0000 UTC m=+2.623946637 image pull 70c92fb64e1eda6ef063d34e60e9a541e44edbaa51e757e8304331202c76a3a7 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct  7 15:44:57 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:44:57 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:44:57 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:44:58 np0005474864 python3.9[46369]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  7 15:44:58 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:09 np0005474864 podman[46382]: 2025-10-07 19:45:09.70975459 +0000 UTC m=+10.873834423 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 15:45:09 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:09 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:09 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:11 np0005474864 python3.9[46677]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  7 15:45:11 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:12 np0005474864 podman[46688]: 2025-10-07 19:45:12.232851224 +0000 UTC m=+1.057736974 image pull f541ff382622bd8bc9ad206129d2a8e74c239ff4503fa3b67d3bdf6d5b50b511 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct  7 15:45:12 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:12 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:12 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:13 np0005474864 python3.9[46925]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  7 15:45:13 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:34 np0005474864 podman[46937]: 2025-10-07 19:45:34.13154282 +0000 UTC m=+20.547656037 image pull 7ac362f4e836cf46e10a309acb4abf774df9481a1d6404c213437495cfb42f5d quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct  7 15:45:34 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:34 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:34 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:35 np0005474864 python3.9[47203]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  7 15:45:35 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:38 np0005474864 podman[47215]: 2025-10-07 19:45:38.537991405 +0000 UTC m=+2.923581968 image pull 5397cd841d80292a5786d82cb8a2bcd574988efb08c605ba6eaaa59d6f646815 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Oct  7 15:45:38 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:38 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:38 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:39 np0005474864 python3.9[47472]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Oct  7 15:45:39 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:40 np0005474864 podman[47484]: 2025-10-07 19:45:40.720544435 +0000 UTC m=+1.152245102 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Oct  7 15:45:40 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:40 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:40 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:45:42 np0005474864 systemd-logind[805]: Session 12 logged out. Waiting for processes to exit.
Oct  7 15:45:42 np0005474864 systemd[1]: session-12.scope: Deactivated successfully.
Oct  7 15:45:42 np0005474864 systemd[1]: session-12.scope: Consumed 1min 58.568s CPU time.
Oct  7 15:45:42 np0005474864 systemd-logind[805]: Removed session 12.
Oct  7 15:45:48 np0005474864 systemd-logind[805]: New session 13 of user zuul.
Oct  7 15:45:48 np0005474864 systemd[1]: Started Session 13 of User zuul.
Oct  7 15:45:49 np0005474864 python3.9[47787]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:45:51 np0005474864 python3.9[47943]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Oct  7 15:45:52 np0005474864 python3.9[48097]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  7 15:45:53 np0005474864 python3.9[48255]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  7 15:45:57 np0005474864 python3.9[48415]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  7 15:45:58 np0005474864 python3.9[48499]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  7 15:46:01 np0005474864 python3.9[48660]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  7 15:46:13 np0005474864 kernel: SELinux:  Converting 2725 SID table entries...
Oct  7 15:46:13 np0005474864 kernel: SELinux:  policy capability network_peer_controls=1
Oct  7 15:46:13 np0005474864 kernel: SELinux:  policy capability open_perms=1
Oct  7 15:46:13 np0005474864 kernel: SELinux:  policy capability extended_socket_class=1
Oct  7 15:46:13 np0005474864 kernel: SELinux:  policy capability always_check_network=0
Oct  7 15:46:13 np0005474864 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  7 15:46:13 np0005474864 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  7 15:46:13 np0005474864 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  7 15:46:13 np0005474864 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Oct  7 15:46:13 np0005474864 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Oct  7 15:46:15 np0005474864 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  7 15:46:15 np0005474864 systemd[1]: Starting man-db-cache-update.service...
Oct  7 15:46:15 np0005474864 systemd[1]: Reloading.
Oct  7 15:46:15 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:46:15 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:46:15 np0005474864 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  7 15:46:16 np0005474864 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  7 15:46:16 np0005474864 systemd[1]: Finished man-db-cache-update.service.
Oct  7 15:46:16 np0005474864 systemd[1]: man-db-cache-update.service: Consumed 1.093s CPU time.
Oct  7 15:46:16 np0005474864 systemd[1]: run-rc1610c3ae4d34b4d9da6c33b7c62313b.service: Deactivated successfully.
Oct  7 15:46:17 np0005474864 python3.9[49762]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  7 15:46:17 np0005474864 systemd[1]: Reloading.
Oct  7 15:46:17 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:46:17 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:46:18 np0005474864 systemd[1]: Starting Open vSwitch Database Unit...
Oct  7 15:46:18 np0005474864 chown[49803]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Oct  7 15:46:18 np0005474864 ovs-ctl[49808]: /etc/openvswitch/conf.db does not exist ... (warning).
Oct  7 15:46:18 np0005474864 ovs-ctl[49808]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Oct  7 15:46:18 np0005474864 ovs-ctl[49808]: Starting ovsdb-server [  OK  ]
Oct  7 15:46:18 np0005474864 ovs-vsctl[49857]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Oct  7 15:46:18 np0005474864 ovs-vsctl[49877]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"2d917af9-e2c2-4b32-93ba-e5708271f327\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Oct  7 15:46:18 np0005474864 ovs-ctl[49808]: Configuring Open vSwitch system IDs [  OK  ]
Oct  7 15:46:18 np0005474864 ovs-ctl[49808]: Enabling remote OVSDB managers [  OK  ]
Oct  7 15:46:18 np0005474864 ovs-vsctl[49883]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Oct  7 15:46:18 np0005474864 systemd[1]: Started Open vSwitch Database Unit.
Oct  7 15:46:18 np0005474864 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Oct  7 15:46:18 np0005474864 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Oct  7 15:46:18 np0005474864 systemd[1]: Starting Open vSwitch Forwarding Unit...
Oct  7 15:46:18 np0005474864 kernel: openvswitch: Open vSwitch switching datapath
Oct  7 15:46:18 np0005474864 ovs-ctl[49927]: Inserting openvswitch module [  OK  ]
Oct  7 15:46:18 np0005474864 ovs-ctl[49896]: Starting ovs-vswitchd [  OK  ]
Oct  7 15:46:18 np0005474864 ovs-vsctl[49944]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Oct  7 15:46:18 np0005474864 ovs-ctl[49896]: Enabling remote OVSDB managers [  OK  ]
Oct  7 15:46:18 np0005474864 systemd[1]: Started Open vSwitch Forwarding Unit.
Oct  7 15:46:18 np0005474864 systemd[1]: Starting Open vSwitch...
Oct  7 15:46:18 np0005474864 systemd[1]: Finished Open vSwitch.
Oct  7 15:46:19 np0005474864 python3.9[50096]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:46:21 np0005474864 python3.9[50248]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Oct  7 15:46:22 np0005474864 kernel: SELinux:  Converting 2739 SID table entries...
Oct  7 15:46:22 np0005474864 kernel: SELinux:  policy capability network_peer_controls=1
Oct  7 15:46:22 np0005474864 kernel: SELinux:  policy capability open_perms=1
Oct  7 15:46:22 np0005474864 kernel: SELinux:  policy capability extended_socket_class=1
Oct  7 15:46:22 np0005474864 kernel: SELinux:  policy capability always_check_network=0
Oct  7 15:46:22 np0005474864 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  7 15:46:22 np0005474864 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  7 15:46:22 np0005474864 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  7 15:46:23 np0005474864 python3.9[50403]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:46:24 np0005474864 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Oct  7 15:46:24 np0005474864 python3.9[50561]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  7 15:46:27 np0005474864 python3.9[50714]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:46:28 np0005474864 python3.9[51001]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  7 15:46:29 np0005474864 python3.9[51151]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:46:30 np0005474864 python3.9[51305]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  7 15:46:32 np0005474864 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  7 15:46:32 np0005474864 systemd[1]: Starting man-db-cache-update.service...
Oct  7 15:46:32 np0005474864 systemd[1]: Reloading.
Oct  7 15:46:32 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:46:32 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:46:32 np0005474864 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  7 15:46:33 np0005474864 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  7 15:46:33 np0005474864 systemd[1]: Finished man-db-cache-update.service.
Oct  7 15:46:33 np0005474864 systemd[1]: run-rbab86cbb8d514cb18744a34fd952dcad.service: Deactivated successfully.
Oct  7 15:46:34 np0005474864 python3.9[51622]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 15:46:35 np0005474864 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Oct  7 15:46:35 np0005474864 systemd[1]: Stopped Network Manager Wait Online.
Oct  7 15:46:35 np0005474864 systemd[1]: Stopping Network Manager Wait Online...
Oct  7 15:46:35 np0005474864 systemd[1]: Stopping Network Manager...
Oct  7 15:46:35 np0005474864 NetworkManager[3955]: <info>  [1759866395.4279] caught SIGTERM, shutting down normally.
Oct  7 15:46:35 np0005474864 NetworkManager[3955]: <info>  [1759866395.4298] dhcp4 (eth0): canceled DHCP transaction
Oct  7 15:46:35 np0005474864 NetworkManager[3955]: <info>  [1759866395.4299] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  7 15:46:35 np0005474864 NetworkManager[3955]: <info>  [1759866395.4299] dhcp4 (eth0): state changed no lease
Oct  7 15:46:35 np0005474864 NetworkManager[3955]: <info>  [1759866395.4301] manager: NetworkManager state is now CONNECTED_SITE
Oct  7 15:46:35 np0005474864 NetworkManager[3955]: <info>  [1759866395.4374] exiting (success)
Oct  7 15:46:35 np0005474864 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  7 15:46:35 np0005474864 systemd[1]: NetworkManager.service: Deactivated successfully.
Oct  7 15:46:35 np0005474864 systemd[1]: Stopped Network Manager.
Oct  7 15:46:35 np0005474864 systemd[1]: NetworkManager.service: Consumed 10.282s CPU time, 4.1M memory peak, read 0B from disk, written 32.5K to disk.
Oct  7 15:46:35 np0005474864 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  7 15:46:35 np0005474864 systemd[1]: Starting Network Manager...
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.5067] NetworkManager (version 1.54.1-1.el9) is starting... (after a restart, boot:6a778c83-97cd-4db9-828a-f91a822d201d)
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.5072] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.5145] manager[0x557846666090]: monitoring kernel firmware directory '/lib/firmware'.
Oct  7 15:46:35 np0005474864 systemd[1]: Starting Hostname Service...
Oct  7 15:46:35 np0005474864 systemd[1]: Started Hostname Service.
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6086] hostname: hostname: using hostnamed
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6087] hostname: static hostname changed from (none) to "compute-2"
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6094] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6102] manager[0x557846666090]: rfkill: Wi-Fi hardware radio set enabled
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6103] manager[0x557846666090]: rfkill: WWAN hardware radio set enabled
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6139] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-ovs.so)
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6153] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-device-plugin-team.so)
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6154] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6155] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6156] manager: Networking is enabled by state file
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6160] settings: Loaded settings plugin: keyfile (internal)
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6168] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.1-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6234] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6251] dhcp: init: Using DHCP client 'internal'
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6256] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6265] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6274] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6287] device (lo): Activation: starting connection 'lo' (7186dd46-7bf7-4d5a-893f-437c9f730689)
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6299] device (eth0): carrier: link connected
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6306] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6316] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6317] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6332] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6344] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6354] device (eth1): carrier: link connected
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6361] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6371] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (ec84c6d0-c5d8-5fc3-a61d-f32b4823d873) (indicated)
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6373] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6382] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6394] device (eth1): Activation: starting connection 'ci-private-network' (ec84c6d0-c5d8-5fc3-a61d-f32b4823d873)
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6406] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Oct  7 15:46:35 np0005474864 systemd[1]: Started Network Manager.
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6424] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6427] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6430] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6433] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6436] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6440] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6444] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6451] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6465] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6469] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6499] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6521] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6535] dhcp4 (eth0): state changed new lease, address=38.102.83.243
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6546] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Oct  7 15:46:35 np0005474864 systemd[1]: Starting Network Manager Wait Online...
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6644] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6654] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6657] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6660] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6667] device (lo): Activation: successful, device activated.
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6674] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6679] manager: NetworkManager state is now CONNECTED_LOCAL
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6685] device (eth1): Activation: successful, device activated.
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6694] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6697] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6702] manager: NetworkManager state is now CONNECTED_SITE
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6707] device (eth0): Activation: successful, device activated.
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6712] manager: NetworkManager state is now CONNECTED_GLOBAL
Oct  7 15:46:35 np0005474864 NetworkManager[51631]: <info>  [1759866395.6717] manager: startup complete
Oct  7 15:46:35 np0005474864 systemd[1]: Finished Network Manager Wait Online.
Oct  7 15:46:36 np0005474864 python3.9[51848]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  7 15:46:41 np0005474864 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  7 15:46:41 np0005474864 systemd[1]: Starting man-db-cache-update.service...
Oct  7 15:46:41 np0005474864 systemd[1]: Reloading.
Oct  7 15:46:41 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:46:41 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:46:41 np0005474864 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  7 15:46:43 np0005474864 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  7 15:46:43 np0005474864 systemd[1]: Finished man-db-cache-update.service.
Oct  7 15:46:43 np0005474864 systemd[1]: run-r000ddb703b1b4d5294d7943ff87cd2e0.service: Deactivated successfully.
Oct  7 15:46:44 np0005474864 python3.9[52315]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:46:45 np0005474864 python3.9[52467]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:46:45 np0005474864 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  7 15:46:46 np0005474864 python3.9[52621]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:46:47 np0005474864 python3.9[52773]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:46:47 np0005474864 python3.9[52925]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:46:48 np0005474864 python3.9[53077]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:46:49 np0005474864 python3.9[53229]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:46:50 np0005474864 python3.9[53352]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1759866409.0523472-649-178156798752712/.source _original_basename=.zfbye0ma follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:46:51 np0005474864 python3.9[53504]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:46:52 np0005474864 python3.9[53656]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Oct  7 15:46:53 np0005474864 python3.9[53808]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:46:55 np0005474864 python3.9[54235]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Oct  7 15:46:56 np0005474864 ansible-async_wrapper.py[54410]: Invoked with j397871011299 300 /home/zuul/.ansible/tmp/ansible-tmp-1759866415.871229-847-89522234775529/AnsiballZ_edpm_os_net_config.py _
Oct  7 15:46:56 np0005474864 ansible-async_wrapper.py[54413]: Starting module and watcher
Oct  7 15:46:56 np0005474864 ansible-async_wrapper.py[54413]: Start watching 54414 (300)
Oct  7 15:46:56 np0005474864 ansible-async_wrapper.py[54414]: Start module (54414)
Oct  7 15:46:56 np0005474864 ansible-async_wrapper.py[54410]: Return async_wrapper task started.
Oct  7 15:46:57 np0005474864 python3.9[54415]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Oct  7 15:46:57 np0005474864 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Oct  7 15:46:57 np0005474864 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Oct  7 15:46:57 np0005474864 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Oct  7 15:46:57 np0005474864 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Oct  7 15:46:57 np0005474864 kernel: cfg80211: failed to load regulatory.db
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.8620] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54416 uid=0 result="success"
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.8646] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54416 uid=0 result="success"
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9159] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9160] audit: op="connection-add" uuid="832f96a9-2fb6-427b-9fdf-ef3cc997a6d6" name="br-ex-br" pid=54416 uid=0 result="success"
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9179] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9180] audit: op="connection-add" uuid="2b6402f1-7d84-476b-a3af-731133007c33" name="br-ex-port" pid=54416 uid=0 result="success"
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9191] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9192] audit: op="connection-add" uuid="640d0afc-5d9b-493b-9bcb-fc16dcead799" name="eth1-port" pid=54416 uid=0 result="success"
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9205] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9206] audit: op="connection-add" uuid="5737658d-d181-4603-bdbf-c074c1fb5c7c" name="vlan20-port" pid=54416 uid=0 result="success"
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9218] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9219] audit: op="connection-add" uuid="06a991d7-9bc7-48b1-b4c2-e7fa3c72d8c0" name="vlan21-port" pid=54416 uid=0 result="success"
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9229] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9230] audit: op="connection-add" uuid="8fc0914a-26c6-4442-b931-0a7fd396729c" name="vlan22-port" pid=54416 uid=0 result="success"
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9251] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.method,ipv6.dhcp-timeout,ipv6.addr-gen-mode,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp,ipv4.dhcp-timeout,ipv4.dhcp-client-id" pid=54416 uid=0 result="success"
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9265] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9266] audit: op="connection-add" uuid="f623b59a-4f3e-420c-b47b-6b1f15b1cfc7" name="br-ex-if" pid=54416 uid=0 result="success"
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9354] audit: op="connection-update" uuid="ec84c6d0-c5d8-5fc3-a61d-f32b4823d873" name="ci-private-network" args="ovs-interface.type,ipv6.addresses,ipv6.method,ipv6.routes,ipv6.addr-gen-mode,ipv6.dns,ipv6.routing-rules,connection.master,connection.timestamp,connection.port-type,connection.controller,connection.slave-type,ipv4.addresses,ipv4.method,ipv4.never-default,ipv4.routes,ipv4.dns,ipv4.routing-rules,ovs-external-ids.data" pid=54416 uid=0 result="success"
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9370] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9371] audit: op="connection-add" uuid="38204bb5-7158-4de0-a24b-fe650eb6f454" name="vlan20-if" pid=54416 uid=0 result="success"
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9385] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9386] audit: op="connection-add" uuid="5100d297-c561-49ef-972e-aceedba28738" name="vlan21-if" pid=54416 uid=0 result="success"
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9400] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9401] audit: op="connection-add" uuid="2c8cf98d-b684-4e5f-9f06-319698cc8cda" name="vlan22-if" pid=54416 uid=0 result="success"
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9412] audit: op="connection-delete" uuid="9410294d-aa1e-3bb4-8bdc-9291d424595c" name="Wired connection 1" pid=54416 uid=0 result="success"
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9425] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9436] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9439] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (832f96a9-2fb6-427b-9fdf-ef3cc997a6d6)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9439] audit: op="connection-activate" uuid="832f96a9-2fb6-427b-9fdf-ef3cc997a6d6" name="br-ex-br" pid=54416 uid=0 result="success"
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9440] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9445] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9447] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (2b6402f1-7d84-476b-a3af-731133007c33)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9449] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9452] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9455] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (640d0afc-5d9b-493b-9bcb-fc16dcead799)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9456] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9461] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9463] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (5737658d-d181-4603-bdbf-c074c1fb5c7c)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9464] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9469] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9472] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (06a991d7-9bc7-48b1-b4c2-e7fa3c72d8c0)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9473] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9478] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9480] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (8fc0914a-26c6-4442-b931-0a7fd396729c)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9481] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9482] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9484] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9488] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9491] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9494] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (f623b59a-4f3e-420c-b47b-6b1f15b1cfc7)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9495] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9497] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9499] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9499] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9500] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9508] device (eth1): disconnecting for new activation request.
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9508] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9510] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9512] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9513] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9515] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9518] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9520] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (38204bb5-7158-4de0-a24b-fe650eb6f454)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9521] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9523] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9524] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9524] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9526] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9529] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9532] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (5100d297-c561-49ef-972e-aceedba28738)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9533] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9534] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9535] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9536] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9538] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9541] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9544] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (2c8cf98d-b684-4e5f-9f06-319698cc8cda)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9545] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9547] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9548] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9549] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9550] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9559] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.method,ipv6.addr-gen-mode,802-3-ethernet.mtu,connection.autoconnect-priority,ipv4.dhcp-timeout,ipv4.dhcp-client-id" pid=54416 uid=0 result="success"
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9563] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9565] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9566] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9571] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9574] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9578] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9580] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9582] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9586] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9588] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9591] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9592] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9595] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9598] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9600] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9601] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 kernel: ovs-system: entered promiscuous mode
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9605] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9608] dhcp4 (eth0): canceled DHCP transaction
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9608] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9608] dhcp4 (eth0): state changed no lease
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9610] dhcp4 (eth0): activation: beginning transaction (no timeout)
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9621] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Oct  7 15:46:58 np0005474864 kernel: Timeout policy base is empty
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9624] audit: op="device-reapply" interface="eth1" ifindex=3 pid=54416 uid=0 result="fail" reason="Device is not activated"
Oct  7 15:46:58 np0005474864 systemd-udevd[54422]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9630] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Oct  7 15:46:58 np0005474864 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  7 15:46:58 np0005474864 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9904] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9909] dhcp4 (eth0): state changed new lease, address=38.102.83.243
Oct  7 15:46:58 np0005474864 NetworkManager[51631]: <info>  [1759866418.9918] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Oct  7 15:46:59 np0005474864 kernel: br-ex: entered promiscuous mode
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0044] device (eth1): disconnecting for new activation request.
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0045] audit: op="connection-activate" uuid="ec84c6d0-c5d8-5fc3-a61d-f32b4823d873" name="ci-private-network" pid=54416 uid=0 result="success"
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0045] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Oct  7 15:46:59 np0005474864 kernel: vlan22: entered promiscuous mode
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0151] device (eth1): Activation: starting connection 'ci-private-network' (ec84c6d0-c5d8-5fc3-a61d-f32b4823d873)
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0157] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 systemd-udevd[54421]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0164] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0166] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0167] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0169] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0170] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 kernel: vlan21: entered promiscuous mode
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0217] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0221] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0226] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0233] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0238] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0244] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0252] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0258] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0264] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0270] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0277] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0283] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0289] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0298] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 kernel: vlan20: entered promiscuous mode
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0326] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0335] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54416 uid=0 result="success"
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0348] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0363] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0374] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Oct  7 15:46:59 np0005474864 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0387] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0401] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0421] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0443] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0447] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0465] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0474] device (eth1): Activation: successful, device activated.
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0485] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0498] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0499] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0501] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0506] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0512] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0521] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0527] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0543] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0552] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0559] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0586] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0916] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0920] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Oct  7 15:46:59 np0005474864 NetworkManager[51631]: <info>  [1759866419.0927] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Oct  7 15:47:00 np0005474864 NetworkManager[51631]: <info>  [1759866420.2110] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54416 uid=0 result="success"
Oct  7 15:47:00 np0005474864 NetworkManager[51631]: <info>  [1759866420.4214] checkpoint[0x55784663c950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Oct  7 15:47:00 np0005474864 NetworkManager[51631]: <info>  [1759866420.4218] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=54416 uid=0 result="success"
Oct  7 15:47:00 np0005474864 python3.9[54749]: ansible-ansible.legacy.async_status Invoked with jid=j397871011299.54410 mode=status _async_dir=/root/.ansible_async
Oct  7 15:47:00 np0005474864 NetworkManager[51631]: <info>  [1759866420.8127] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54416 uid=0 result="success"
Oct  7 15:47:00 np0005474864 NetworkManager[51631]: <info>  [1759866420.8139] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54416 uid=0 result="success"
Oct  7 15:47:00 np0005474864 NetworkManager[51631]: <info>  [1759866420.9978] audit: op="networking-control" arg="global-dns-configuration" pid=54416 uid=0 result="success"
Oct  7 15:47:01 np0005474864 NetworkManager[51631]: <info>  [1759866421.0008] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Oct  7 15:47:01 np0005474864 NetworkManager[51631]: <info>  [1759866421.0041] audit: op="networking-control" arg="global-dns-configuration" pid=54416 uid=0 result="success"
Oct  7 15:47:01 np0005474864 NetworkManager[51631]: <info>  [1759866421.0057] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54416 uid=0 result="success"
Oct  7 15:47:01 np0005474864 NetworkManager[51631]: <info>  [1759866421.1807] checkpoint[0x55784663ca20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Oct  7 15:47:01 np0005474864 NetworkManager[51631]: <info>  [1759866421.1816] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=54416 uid=0 result="success"
Oct  7 15:47:01 np0005474864 ansible-async_wrapper.py[54414]: Module complete (54414)
Oct  7 15:47:01 np0005474864 ansible-async_wrapper.py[54413]: Done in kid B.
Oct  7 15:47:04 np0005474864 python3.9[54855]: ansible-ansible.legacy.async_status Invoked with jid=j397871011299.54410 mode=status _async_dir=/root/.ansible_async
Oct  7 15:47:04 np0005474864 python3.9[54954]: ansible-ansible.legacy.async_status Invoked with jid=j397871011299.54410 mode=cleanup _async_dir=/root/.ansible_async
Oct  7 15:47:05 np0005474864 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  7 15:47:05 np0005474864 python3.9[55108]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:47:06 np0005474864 python3.9[55231]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759866425.309439-928-97885119621615/.source.returncode _original_basename=.3_2kz8i2 follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:47:07 np0005474864 python3.9[55384]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:47:08 np0005474864 python3.9[55507]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759866426.9604716-976-45587508802305/.source.cfg _original_basename=.60_266x0 follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:47:09 np0005474864 python3.9[55659]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 15:47:09 np0005474864 systemd[1]: Reloading Network Manager...
Oct  7 15:47:09 np0005474864 NetworkManager[51631]: <info>  [1759866429.3350] audit: op="reload" arg="0" pid=55663 uid=0 result="success"
Oct  7 15:47:09 np0005474864 NetworkManager[51631]: <info>  [1759866429.3356] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Oct  7 15:47:09 np0005474864 systemd[1]: Reloaded Network Manager.
Oct  7 15:47:09 np0005474864 systemd[1]: session-13.scope: Deactivated successfully.
Oct  7 15:47:09 np0005474864 systemd[1]: session-13.scope: Consumed 55.584s CPU time.
Oct  7 15:47:09 np0005474864 systemd-logind[805]: Session 13 logged out. Waiting for processes to exit.
Oct  7 15:47:09 np0005474864 systemd-logind[805]: Removed session 13.
Oct  7 15:47:15 np0005474864 systemd-logind[805]: New session 14 of user zuul.
Oct  7 15:47:15 np0005474864 systemd[1]: Started Session 14 of User zuul.
Oct  7 15:47:16 np0005474864 python3.9[55847]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:47:18 np0005474864 python3.9[56002]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  7 15:47:19 np0005474864 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  7 15:47:19 np0005474864 python3.9[56191]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:47:19 np0005474864 systemd[1]: session-14.scope: Deactivated successfully.
Oct  7 15:47:19 np0005474864 systemd[1]: session-14.scope: Consumed 2.705s CPU time.
Oct  7 15:47:19 np0005474864 systemd-logind[805]: Session 14 logged out. Waiting for processes to exit.
Oct  7 15:47:19 np0005474864 systemd-logind[805]: Removed session 14.
Oct  7 15:47:25 np0005474864 systemd-logind[805]: New session 15 of user zuul.
Oct  7 15:47:25 np0005474864 systemd[1]: Started Session 15 of User zuul.
Oct  7 15:47:26 np0005474864 python3.9[56373]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:47:27 np0005474864 python3.9[56528]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:47:29 np0005474864 python3.9[56684]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  7 15:47:29 np0005474864 python3.9[56768]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  7 15:47:31 np0005474864 python3.9[56922]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  7 15:47:33 np0005474864 python3.9[57113]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:47:34 np0005474864 python3.9[57265]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:47:34 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:47:35 np0005474864 python3.9[57428]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:47:36 np0005474864 python3.9[57506]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:47:36 np0005474864 python3.9[57658]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:47:37 np0005474864 python3.9[57736]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:47:38 np0005474864 python3.9[57888]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:47:39 np0005474864 python3.9[58040]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:47:40 np0005474864 python3.9[58192]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:47:40 np0005474864 python3.9[58344]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:47:41 np0005474864 python3.9[58496]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  7 15:47:44 np0005474864 python3.9[58649]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:47:44 np0005474864 python3.9[58803]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:47:45 np0005474864 python3.9[58955]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:47:46 np0005474864 python3.9[59107]: ansible-service_facts Invoked
Oct  7 15:47:46 np0005474864 network[59124]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  7 15:47:46 np0005474864 network[59125]: 'network-scripts' will be removed from distribution in near future.
Oct  7 15:47:46 np0005474864 network[59126]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  7 15:47:53 np0005474864 python3.9[59581]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  7 15:47:56 np0005474864 python3.9[59734]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Oct  7 15:47:58 np0005474864 python3.9[59886]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:47:58 np0005474864 python3.9[60011]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759866477.5347412-623-61438528285422/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:47:59 np0005474864 python3.9[60165]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:48:00 np0005474864 python3.9[60290]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759866479.1426048-669-124867999442283/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:48:02 np0005474864 python3.9[60444]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:48:04 np0005474864 python3.9[60598]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  7 15:48:05 np0005474864 python3.9[60682]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:48:06 np0005474864 python3.9[60836]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  7 15:48:07 np0005474864 python3.9[60920]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 15:48:07 np0005474864 chronyd[792]: chronyd exiting
Oct  7 15:48:07 np0005474864 systemd[1]: Stopping NTP client/server...
Oct  7 15:48:07 np0005474864 systemd[1]: chronyd.service: Deactivated successfully.
Oct  7 15:48:07 np0005474864 systemd[1]: Stopped NTP client/server.
Oct  7 15:48:07 np0005474864 systemd[1]: Starting NTP client/server...
Oct  7 15:48:07 np0005474864 chronyd[60929]: chronyd version 4.6.1 starting (+CMDMON +NTP +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +ASYNCDNS +NTS +SECHASH +IPV6 +DEBUG)
Oct  7 15:48:07 np0005474864 chronyd[60929]: Frequency -28.993 +/- 0.295 ppm read from /var/lib/chrony/drift
Oct  7 15:48:07 np0005474864 chronyd[60929]: Loaded seccomp filter (level 2)
Oct  7 15:48:07 np0005474864 systemd[1]: Started NTP client/server.
Oct  7 15:48:08 np0005474864 systemd[1]: session-15.scope: Deactivated successfully.
Oct  7 15:48:08 np0005474864 systemd[1]: session-15.scope: Consumed 27.777s CPU time.
Oct  7 15:48:08 np0005474864 systemd-logind[805]: Session 15 logged out. Waiting for processes to exit.
Oct  7 15:48:08 np0005474864 systemd-logind[805]: Removed session 15.
Oct  7 15:48:14 np0005474864 systemd-logind[805]: New session 16 of user zuul.
Oct  7 15:48:14 np0005474864 systemd[1]: Started Session 16 of User zuul.
Oct  7 15:48:15 np0005474864 python3.9[61108]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:48:16 np0005474864 python3.9[61264]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:48:17 np0005474864 python3.9[61439]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:48:18 np0005474864 python3.9[61517]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=._4523n3p recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:48:19 np0005474864 python3.9[61669]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:48:20 np0005474864 python3.9[61792]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759866499.022544-145-257953810822678/.source _original_basename=.eifr7ymc follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:48:21 np0005474864 python3.9[61944]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:48:22 np0005474864 python3.9[62096]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:48:22 np0005474864 python3.9[62219]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759866501.4762495-218-249051906819092/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:48:23 np0005474864 python3.9[62371]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:48:24 np0005474864 python3.9[62494]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759866503.0112698-218-189682889579428/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:48:25 np0005474864 python3.9[62646]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:48:25 np0005474864 python3.9[62798]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:48:26 np0005474864 python3.9[62921]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866505.3150725-328-226464492872454/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:48:27 np0005474864 python3.9[63073]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:48:28 np0005474864 python3.9[63196]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866506.8528104-373-138286394768207/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:48:29 np0005474864 python3.9[63348]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:48:29 np0005474864 systemd[1]: Reloading.
Oct  7 15:48:29 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:48:29 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:48:29 np0005474864 systemd[1]: Reloading.
Oct  7 15:48:29 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:48:29 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:48:30 np0005474864 systemd[1]: Starting EDPM Container Shutdown...
Oct  7 15:48:30 np0005474864 systemd[1]: Finished EDPM Container Shutdown.
Oct  7 15:48:30 np0005474864 python3.9[63576]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:48:31 np0005474864 python3.9[63699]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866510.2883065-444-48399336722856/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:48:32 np0005474864 python3.9[63851]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:48:33 np0005474864 python3.9[63974]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866511.8286812-488-250935058572356/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:48:34 np0005474864 python3.9[64126]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:48:34 np0005474864 systemd[1]: Reloading.
Oct  7 15:48:34 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:48:34 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:48:34 np0005474864 systemd[1]: Reloading.
Oct  7 15:48:34 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:48:34 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:48:34 np0005474864 systemd[1]: Starting Create netns directory...
Oct  7 15:48:34 np0005474864 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  7 15:48:34 np0005474864 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  7 15:48:34 np0005474864 systemd[1]: Finished Create netns directory.
Oct  7 15:48:35 np0005474864 python3.9[64353]: ansible-ansible.builtin.service_facts Invoked
Oct  7 15:48:35 np0005474864 network[64370]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  7 15:48:35 np0005474864 network[64371]: 'network-scripts' will be removed from distribution in near future.
Oct  7 15:48:35 np0005474864 network[64372]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  7 15:48:40 np0005474864 python3.9[64636]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:48:40 np0005474864 systemd[1]: Reloading.
Oct  7 15:48:40 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:48:40 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:48:40 np0005474864 systemd[1]: Stopping IPv4 firewall with iptables...
Oct  7 15:48:40 np0005474864 iptables.init[64675]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Oct  7 15:48:40 np0005474864 iptables.init[64675]: iptables: Flushing firewall rules: [  OK  ]
Oct  7 15:48:40 np0005474864 systemd[1]: iptables.service: Deactivated successfully.
Oct  7 15:48:40 np0005474864 systemd[1]: Stopped IPv4 firewall with iptables.
Oct  7 15:48:41 np0005474864 python3.9[64871]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:48:44 np0005474864 python3.9[65025]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:48:44 np0005474864 systemd[1]: Reloading.
Oct  7 15:48:44 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:48:44 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:48:44 np0005474864 systemd[1]: Starting Netfilter Tables...
Oct  7 15:48:44 np0005474864 systemd[1]: Finished Netfilter Tables.
Oct  7 15:48:45 np0005474864 python3.9[65217]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:48:46 np0005474864 python3.9[65370]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:48:47 np0005474864 python3.9[65495]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759866526.1386716-695-190153669227845/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=4729b6ffc5b555fa142bf0b6e6dc15609cb89a22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:48:48 np0005474864 python3.9[65646]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 15:49:14 np0005474864 systemd[1]: session-16.scope: Deactivated successfully.
Oct  7 15:49:14 np0005474864 systemd[1]: session-16.scope: Consumed 24.131s CPU time.
Oct  7 15:49:14 np0005474864 systemd-logind[805]: Session 16 logged out. Waiting for processes to exit.
Oct  7 15:49:14 np0005474864 systemd-logind[805]: Removed session 16.
Oct  7 15:49:27 np0005474864 systemd-logind[805]: New session 17 of user zuul.
Oct  7 15:49:27 np0005474864 systemd[1]: Started Session 17 of User zuul.
Oct  7 15:49:28 np0005474864 python3.9[65839]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:49:29 np0005474864 python3.9[65995]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:49:30 np0005474864 python3.9[66170]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:49:31 np0005474864 python3.9[66248]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.i092eyge recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:49:32 np0005474864 python3.9[66400]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:49:32 np0005474864 python3.9[66478]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/etc/sysconfig/podman_drop_in _original_basename=.rmh2a822 recurse=False state=file path=/etc/sysconfig/podman_drop_in force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:49:33 np0005474864 python3.9[66630]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:49:34 np0005474864 python3.9[66782]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:49:34 np0005474864 python3.9[66860]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:49:35 np0005474864 python3.9[67012]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:49:36 np0005474864 python3.9[67090]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:49:37 np0005474864 python3.9[67242]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:49:37 np0005474864 python3.9[67394]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:49:38 np0005474864 python3.9[67472]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:49:39 np0005474864 python3.9[67624]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:49:39 np0005474864 python3.9[67702]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:49:41 np0005474864 python3.9[67854]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:49:41 np0005474864 systemd[1]: Reloading.
Oct  7 15:49:41 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:49:41 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:49:42 np0005474864 python3.9[68043]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:49:42 np0005474864 python3.9[68121]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:49:43 np0005474864 python3.9[68273]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:49:44 np0005474864 python3.9[68351]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:49:45 np0005474864 python3.9[68503]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:49:45 np0005474864 systemd[1]: Reloading.
Oct  7 15:49:45 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:49:45 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:49:45 np0005474864 systemd[1]: Starting Create netns directory...
Oct  7 15:49:45 np0005474864 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  7 15:49:45 np0005474864 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  7 15:49:45 np0005474864 systemd[1]: Finished Create netns directory.
Oct  7 15:49:47 np0005474864 python3.9[68694]: ansible-ansible.builtin.service_facts Invoked
Oct  7 15:49:47 np0005474864 network[68711]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  7 15:49:47 np0005474864 network[68712]: 'network-scripts' will be removed from distribution in near future.
Oct  7 15:49:47 np0005474864 network[68713]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  7 15:49:53 np0005474864 python3.9[68976]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:49:53 np0005474864 python3.9[69054]: ansible-ansible.legacy.file Invoked with mode=0600 dest=/etc/ssh/sshd_config _original_basename=sshd_config_block.j2 recurse=False state=file path=/etc/ssh/sshd_config force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:49:54 np0005474864 python3.9[69206]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:49:55 np0005474864 python3.9[69358]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:49:56 np0005474864 python3.9[69481]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866594.9542615-610-107209577915807/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:49:57 np0005474864 python3.9[69633]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Oct  7 15:49:57 np0005474864 systemd[1]: Starting Time & Date Service...
Oct  7 15:49:57 np0005474864 systemd[1]: Started Time & Date Service.
Oct  7 15:49:58 np0005474864 python3.9[69789]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:49:59 np0005474864 python3.9[69941]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:50:00 np0005474864 python3.9[70064]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759866599.1991029-716-224416217340718/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:50:01 np0005474864 python3.9[70216]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:50:02 np0005474864 python3.9[70339]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759866600.83902-761-213547195254713/.source.yaml _original_basename=.7zngr7_t follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:50:03 np0005474864 python3.9[70491]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:50:03 np0005474864 python3.9[70614]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866602.3899398-806-275657402210940/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:50:04 np0005474864 python3.9[70766]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:50:05 np0005474864 python3.9[70919]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:50:06 np0005474864 python3[71072]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  7 15:50:07 np0005474864 python3.9[71224]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:50:08 np0005474864 python3.9[71347]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866607.0459878-923-4297777426268/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:50:09 np0005474864 python3.9[71499]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:50:09 np0005474864 python3.9[71622]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866608.5484238-968-160458456309763/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:50:10 np0005474864 python3.9[71774]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:50:11 np0005474864 python3.9[71897]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866610.1710136-1013-273958531747987/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:50:12 np0005474864 python3.9[72049]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:50:13 np0005474864 python3.9[72172]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866611.7708821-1058-174312311790324/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:50:13 np0005474864 python3.9[72324]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:50:14 np0005474864 python3.9[72447]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866613.2906387-1103-77714506811209/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:50:15 np0005474864 python3.9[72599]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:50:16 np0005474864 python3.9[72751]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:50:17 np0005474864 python3.9[72910]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:50:17 np0005474864 chronyd[60929]: Selected source 167.160.187.12 (pool.ntp.org)
Oct  7 15:50:18 np0005474864 python3.9[73063]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:50:19 np0005474864 python3.9[73215]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:50:20 np0005474864 python3.9[73367]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct  7 15:50:20 np0005474864 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  7 15:50:20 np0005474864 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  7 15:50:21 np0005474864 python3.9[73521]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Oct  7 15:50:21 np0005474864 systemd[1]: session-17.scope: Deactivated successfully.
Oct  7 15:50:21 np0005474864 systemd[1]: session-17.scope: Consumed 38.282s CPU time.
Oct  7 15:50:21 np0005474864 systemd-logind[805]: Session 17 logged out. Waiting for processes to exit.
Oct  7 15:50:21 np0005474864 systemd-logind[805]: Removed session 17.
Oct  7 15:50:27 np0005474864 systemd-logind[805]: New session 18 of user zuul.
Oct  7 15:50:27 np0005474864 systemd[1]: Started Session 18 of User zuul.
Oct  7 15:50:27 np0005474864 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  7 15:50:28 np0005474864 python3.9[73705]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Oct  7 15:50:29 np0005474864 python3.9[73857]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:50:31 np0005474864 python3.9[74009]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:50:32 np0005474864 python3.9[74161]: ansible-ansible.builtin.blockinfile Invoked with block=compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCmNyji/D5xC3wGYSpjZBR91y1eVE5Nmu81dp/XiAKVPfAI8Fpz11LwwQqLVwaZ9bYAKcYUGK78e7gJzPyC5p0XVEWcKpiAH6qHIShmaKi9pG4E9pElZZQIS0vOvteJRAA9EaqYKUfmI/Jd7d55eh01YZOu1mM8ggWcWM8jZVvgQjg+LNRbH7GhYgN1cN0GeGBPqEoN+ev4SB/VDpmYys+d9HS1hMlC9jHwoLDil4l/r+GgbQid349fXW8ulGj9/HUpgIBbyVWaO/cl5NDIJ2GgEIOQYFTkVilv2fUyzucV2oc7pgc5xQqriWNfG5B4UPazJhGcbUZOcRpxphJyvXw1Rq+bE3nQmRjqkexo2T56VLFM2dfSySWApZIyXBOJ8DVR0Ci4NNVrgO5InlsXK6KJoHDOi/JWcJ8+K388SxUiN0D3NwNlLtIJUE0hvCF5xYhxgoIPrENJc74laj7o+yE7Kh1uJEXeWxhXr8jtqZA4VtgDVMrdjiZVTUyN/ehgb0c=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEAsKJiVzzvN1iKafw+86Dv/tZ95IHg4N6HoQt+QhYqC#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBJEhtIO7C+QjT5YgwyF/I67C5Xkss8BYm9kOeYsH0CRnZjf2Gqob+02LKo6TaFASwXC/wTwZxaB6aCTT3Vww/Qw=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC60oKzgkchWABX2hE3YpiswmAIPVLq2Kkeq0mW+8P2G6HU16DlRafCL2QXQyelzr72tHAmxnxS0xOYiylO5TnqK1QT5GhdZOizSXwWGCOaLfsHh/B0uYoRlYR2pyhF3syD1kiEiM+gCtpRqGCup52Y01pwCrJTZiD+syxOqn/fqJOlFNgcn1/jp0SdkptWkwgndRtRQyw9ogSAGwYl+koKpvbjxzcKRqe2bH9YX/antCx8e23YlHDG/nxgRJ2IH9lMH8BdIjtzC3pDWpuMt3RNj+GRWaeUho3cZ6uo1fVoZxsRguMVsTIugvEnDfciOSB6W6XTIAF31P2uU3ekkQhnVDP3i+CAN6BrAf3QTgPT9woeXf/MmUlvL7192IeSKWoezec3XlZNxDjxXpT4WSlFtr/4rl9yjjV5UqKumRGh842xOoxWQUplR/vV3Ul5FskWSRHeNqX/RUol9nILAg4GB9rgeJlhUAoSe+AZfyrw148FDXIL+zXqt/ekJpgjmzc=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHhV5BacXTfRfRY0A5zUPLngYMcJOI7InW0RDqWlSveS#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBG1iHaQm9cAzLafyGqeqVzBe+HEeZOdAZSQ4RQ0MFLv5dIhQdpBjK1VdfbKsiSVz1ikdzAwZg277V0WPp5dBbOA=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDKGnWrygHsGYu/+5FOcSVM7Jv1IT65ecF1r0mljd8N6Wu3WpSPj0jAG7mTLk5ymgrmSlgt3AoWPnsti+GJCMDhH3BwnydeXTqbQ2hzh8qOXATHYHx3rEilf1fkdNp1ok7L9s/OS3oGaSuVyJyLP4Cl/OBSV7qsNwBk8eZH/ZQw/UiJ9/0d9F78BEYRoAEzYTUjuBZ49tbeMyuvSl7W2rKpl2T4fNHMJjfMxWLMLEjlVgGbz8WcbBPeeO/FoR+vlJrYlTx5SjTy2C5+IRG2YXVdlAhJj0v16T18ugTeRXS9ymPs1kLI28GH0JUhS2dwUV/bEU8UkFn9OjRtGpp50Lwkd/j11cM8TBjy/C1DgVv+1qfzaJD1QSOTHIsBfH92qWEpHwXmmMbx98rUcKFC6nKNT6RAKON2ZVjf948PWEKKjArrHwlmMTxpE8Xgt4wKHXZN3qzVuZ+hQUM7iKZ65BHqDzwDqNC5vN7PMdBYxW6rTRZ2735cGvVXHBgxltoT/3s=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIJN3C5jl1T0+5FDLkjbUV1VSdd2Q548KLDIPt+xYOxFA#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPc3K5YgavEVJ4/NtBe2c6qkIvysrxK+lz9B2x2DOYkT+UNQglNvH2YspNbvQHxbbrxFVS8BtLGKU4dRSe+BvjE=#012 create=True mode=0644 path=/tmp/ansible.wyiw46jb state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:50:33 np0005474864 python3.9[74313]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.wyiw46jb' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:50:34 np0005474864 python3.9[74467]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.wyiw46jb state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:50:34 np0005474864 systemd[1]: session-18.scope: Deactivated successfully.
Oct  7 15:50:34 np0005474864 systemd[1]: session-18.scope: Consumed 4.350s CPU time.
Oct  7 15:50:34 np0005474864 systemd-logind[805]: Session 18 logged out. Waiting for processes to exit.
Oct  7 15:50:34 np0005474864 systemd-logind[805]: Removed session 18.
Oct  7 15:50:39 np0005474864 systemd-logind[805]: New session 19 of user zuul.
Oct  7 15:50:39 np0005474864 systemd[1]: Started Session 19 of User zuul.
Oct  7 15:50:40 np0005474864 python3.9[74645]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:50:42 np0005474864 python3.9[74801]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct  7 15:50:43 np0005474864 python3.9[74955]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 15:50:44 np0005474864 python3.9[75108]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:50:45 np0005474864 python3.9[75261]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:50:45 np0005474864 python3.9[75415]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:50:46 np0005474864 python3.9[75570]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:50:47 np0005474864 systemd[1]: session-19.scope: Deactivated successfully.
Oct  7 15:50:47 np0005474864 systemd[1]: session-19.scope: Consumed 5.172s CPU time.
Oct  7 15:50:47 np0005474864 systemd-logind[805]: Session 19 logged out. Waiting for processes to exit.
Oct  7 15:50:47 np0005474864 systemd-logind[805]: Removed session 19.
Oct  7 15:50:53 np0005474864 systemd-logind[805]: New session 20 of user zuul.
Oct  7 15:50:53 np0005474864 systemd[1]: Started Session 20 of User zuul.
Oct  7 15:50:54 np0005474864 python3.9[75748]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:50:55 np0005474864 python3.9[75904]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  7 15:50:56 np0005474864 python3.9[75988]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Oct  7 15:50:58 np0005474864 python3.9[76139]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:50:59 np0005474864 python3.9[76290]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  7 15:51:00 np0005474864 python3.9[76440]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:51:01 np0005474864 python3.9[76590]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:51:02 np0005474864 systemd[1]: session-20.scope: Deactivated successfully.
Oct  7 15:51:02 np0005474864 systemd[1]: session-20.scope: Consumed 6.880s CPU time.
Oct  7 15:51:02 np0005474864 systemd-logind[805]: Session 20 logged out. Waiting for processes to exit.
Oct  7 15:51:02 np0005474864 systemd-logind[805]: Removed session 20.
Oct  7 15:51:07 np0005474864 systemd-logind[805]: New session 21 of user zuul.
Oct  7 15:51:07 np0005474864 systemd[1]: Started Session 21 of User zuul.
Oct  7 15:51:08 np0005474864 python3.9[76768]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:51:10 np0005474864 python3.9[76924]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:51:10 np0005474864 python3.9[77076]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:51:11 np0005474864 python3.9[77228]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:51:12 np0005474864 python3.9[77351]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866671.1847003-160-279960876440056/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=d715f46e3141f8a51528a203d0d64f14984b4efe backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:51:13 np0005474864 python3.9[77503]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:51:13 np0005474864 python3.9[77626]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866672.6842089-160-95997875859837/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=35c98902b8df5d9fc10dd9e151edaad9c168580a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:51:14 np0005474864 python3.9[77778]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:51:15 np0005474864 python3.9[77901]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866674.1208582-160-247039252934952/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=9d5e85d2837394a2fac23255202c1d343ec5220a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:51:16 np0005474864 python3.9[78053]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:51:16 np0005474864 python3.9[78205]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:51:17 np0005474864 python3.9[78357]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:51:18 np0005474864 python3.9[78480]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866677.2690575-346-238430923607421/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=4a2d28a0ea6f3f7f4eb5e0c39b5265f78f470da1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:51:19 np0005474864 python3.9[78632]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:51:19 np0005474864 python3.9[78755]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866678.7078614-346-192494920996941/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=e040544f6cda36ddc2febb0e00b02cfbc5a3bb0f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:51:20 np0005474864 python3.9[78907]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:51:21 np0005474864 python3.9[79030]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866680.136127-346-51310018988675/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=86f71c217a7db4cd9cbf4acfba5f0ac155f64af1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:51:22 np0005474864 python3.9[79182]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:51:22 np0005474864 python3.9[79334]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:51:23 np0005474864 python3.9[79486]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:51:24 np0005474864 python3.9[79609]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866683.196695-535-229885007654429/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=5107f268b440093730be1c32ae47a43962b85bdc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:51:25 np0005474864 python3.9[79761]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:51:25 np0005474864 python3.9[79884]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866684.5837512-535-90770699327265/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=d20f6b47b126a42245802d709b61226ae5d6a434 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:51:26 np0005474864 python3.9[80036]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:51:27 np0005474864 python3.9[80159]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866686.0016205-535-236405726725763/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=d4403a53c9f2d9e6c539bfb06bc548c498516b69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:51:28 np0005474864 python3.9[80311]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:51:28 np0005474864 python3.9[80463]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:51:29 np0005474864 python3.9[80615]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:51:30 np0005474864 python3.9[80738]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866689.2291594-720-228182976091462/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=36ebf3e2a31d5b9250ec7e7300f07e1aeb890290 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:51:31 np0005474864 python3.9[80890]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:51:31 np0005474864 python3.9[81013]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866690.6796315-720-54383458125342/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=d20f6b47b126a42245802d709b61226ae5d6a434 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:51:32 np0005474864 python3.9[81165]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:51:33 np0005474864 python3.9[81288]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866692.1439111-720-122986436680960/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=32525799978ad95f00e1d3667476845b826e6bdd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:51:34 np0005474864 python3.9[81440]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:51:35 np0005474864 python3.9[81592]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:51:36 np0005474864 python3.9[81715]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866695.0385902-934-266546546235318/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=13dbff74fbaeb9060262c6a672f4253d8d1d5def backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:51:37 np0005474864 python3.9[81867]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:51:37 np0005474864 python3.9[82019]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:51:38 np0005474864 python3.9[82142]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866697.373306-1007-41618216357492/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=13dbff74fbaeb9060262c6a672f4253d8d1d5def backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:51:39 np0005474864 python3.9[82294]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:51:40 np0005474864 python3.9[82446]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:51:41 np0005474864 python3.9[82569]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866699.8379653-1080-102947784219238/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=13dbff74fbaeb9060262c6a672f4253d8d1d5def backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:51:42 np0005474864 python3.9[82721]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:51:42 np0005474864 python3.9[82873]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:51:43 np0005474864 python3.9[82996]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866702.3679938-1153-14758394864696/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=13dbff74fbaeb9060262c6a672f4253d8d1d5def backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:51:44 np0005474864 python3.9[83148]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:51:45 np0005474864 python3.9[83300]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:51:46 np0005474864 python3.9[83423]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866704.7249298-1228-11125333525165/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=13dbff74fbaeb9060262c6a672f4253d8d1d5def backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:51:47 np0005474864 python3.9[83575]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:51:47 np0005474864 systemd[1]: packagekit.service: Deactivated successfully.
Oct  7 15:51:47 np0005474864 python3.9[83728]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:51:48 np0005474864 python3.9[83851]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866707.23938-1305-193698280925540/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=13dbff74fbaeb9060262c6a672f4253d8d1d5def backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:51:49 np0005474864 python3.9[84003]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:51:50 np0005474864 python3.9[84155]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:51:51 np0005474864 python3.9[84278]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866709.6801305-1352-204558139224463/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=13dbff74fbaeb9060262c6a672f4253d8d1d5def backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:51:51 np0005474864 systemd[1]: session-21.scope: Deactivated successfully.
Oct  7 15:51:51 np0005474864 systemd[1]: session-21.scope: Consumed 35.546s CPU time.
Oct  7 15:51:51 np0005474864 systemd-logind[805]: Session 21 logged out. Waiting for processes to exit.
Oct  7 15:51:51 np0005474864 systemd-logind[805]: Removed session 21.
Oct  7 15:51:56 np0005474864 systemd-logind[805]: New session 22 of user zuul.
Oct  7 15:51:56 np0005474864 systemd[1]: Started Session 22 of User zuul.
Oct  7 15:51:58 np0005474864 python3.9[84456]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:51:59 np0005474864 python3.9[84612]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:52:00 np0005474864 python3.9[84764]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:52:01 np0005474864 python3.9[84914]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:52:02 np0005474864 python3.9[85066]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct  7 15:52:04 np0005474864 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Oct  7 15:52:04 np0005474864 python3.9[85222]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  7 15:52:05 np0005474864 python3.9[85306]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  7 15:52:08 np0005474864 python3.9[85459]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  7 15:52:10 np0005474864 python3[85614]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Oct  7 15:52:11 np0005474864 python3.9[85766]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:52:12 np0005474864 python3.9[85918]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:52:12 np0005474864 python3.9[85996]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:52:13 np0005474864 python3.9[86148]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:52:14 np0005474864 python3.9[86226]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.hzko1ndh recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:52:15 np0005474864 python3.9[86378]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:52:15 np0005474864 python3.9[86456]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:52:16 np0005474864 python3.9[86608]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:52:17 np0005474864 python3[86761]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  7 15:52:18 np0005474864 python3.9[86913]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:52:19 np0005474864 python3.9[87038]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866737.8971093-433-133576142967058/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:52:20 np0005474864 python3.9[87190]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:52:20 np0005474864 python3.9[87315]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866739.6341054-479-216043724410918/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:52:21 np0005474864 python3.9[87467]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:52:22 np0005474864 python3.9[87592]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866741.2092597-524-268851039207940/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:52:23 np0005474864 python3.9[87744]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:52:23 np0005474864 python3.9[87869]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866742.6678376-568-40500623849383/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:52:24 np0005474864 python3.9[88021]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:52:25 np0005474864 python3.9[88146]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759866744.2268963-613-216948989131697/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:52:26 np0005474864 python3.9[88298]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:52:27 np0005474864 python3.9[88450]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:52:28 np0005474864 python3.9[88605]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:52:29 np0005474864 python3.9[88757]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:52:30 np0005474864 python3.9[88910]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:52:31 np0005474864 python3.9[89064]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:52:31 np0005474864 python3.9[89219]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:52:33 np0005474864 python3.9[89369]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:52:34 np0005474864 python3.9[89522]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:3e:0a:d8:76:c8:90" external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:52:34 np0005474864 ovs-vsctl[89523]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:3e:0a:d8:76:c8:90 external_ids:ovn-encap-ip=172.19.0.101 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Oct  7 15:52:35 np0005474864 python3.9[89675]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:52:36 np0005474864 python3.9[89830]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:52:36 np0005474864 ovs-vsctl[89831]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Oct  7 15:52:37 np0005474864 python3.9[89981]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:52:38 np0005474864 python3.9[90135]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:52:38 np0005474864 python3.9[90287]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:52:39 np0005474864 python3.9[90365]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:52:40 np0005474864 python3.9[90517]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:52:40 np0005474864 python3.9[90595]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:52:41 np0005474864 python3.9[90747]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:52:42 np0005474864 python3.9[90899]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:52:42 np0005474864 python3.9[90977]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:52:43 np0005474864 python3.9[91129]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:52:44 np0005474864 python3.9[91207]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:52:45 np0005474864 python3.9[91359]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:52:45 np0005474864 systemd[1]: Reloading.
Oct  7 15:52:45 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:52:45 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:52:46 np0005474864 python3.9[91548]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:52:46 np0005474864 python3.9[91626]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:52:47 np0005474864 python3.9[91778]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:52:48 np0005474864 python3.9[91856]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:52:49 np0005474864 python3.9[92008]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:52:49 np0005474864 systemd[1]: Reloading.
Oct  7 15:52:49 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:52:49 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:52:49 np0005474864 systemd[1]: Starting Create netns directory...
Oct  7 15:52:49 np0005474864 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  7 15:52:49 np0005474864 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  7 15:52:49 np0005474864 systemd[1]: Finished Create netns directory.
Oct  7 15:52:50 np0005474864 python3.9[92202]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:52:51 np0005474864 python3.9[92354]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:52:51 np0005474864 python3.9[92477]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759866770.7378845-1366-270954201316996/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:52:53 np0005474864 python3.9[92629]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:52:54 np0005474864 python3.9[92781]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:52:54 np0005474864 python3.9[92904]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759866773.4242406-1441-267106249305218/.source.json _original_basename=.cvdrez82 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:52:55 np0005474864 python3.9[93056]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:52:58 np0005474864 python3.9[93483]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Oct  7 15:52:59 np0005474864 python3.9[93635]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  7 15:53:00 np0005474864 python3.9[93787]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  7 15:53:00 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:53:02 np0005474864 python3[93950]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  7 15:53:02 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:53:02 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:53:02 np0005474864 podman[93986]: 2025-10-07 19:53:02.426784057 +0000 UTC m=+0.064696260 container create a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 15:53:02 np0005474864 podman[93986]: 2025-10-07 19:53:02.393969649 +0000 UTC m=+0.031881862 image pull 70c92fb64e1eda6ef063d34e60e9a541e44edbaa51e757e8304331202c76a3a7 quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct  7 15:53:02 np0005474864 python3[93950]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Oct  7 15:53:03 np0005474864 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Oct  7 15:53:03 np0005474864 python3.9[94176]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:53:04 np0005474864 python3.9[94330]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:53:04 np0005474864 python3.9[94406]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:53:05 np0005474864 python3.9[94557]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759866784.9657154-1705-90252282023138/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:53:06 np0005474864 python3.9[94633]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  7 15:53:06 np0005474864 systemd[1]: Reloading.
Oct  7 15:53:06 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:53:06 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:53:07 np0005474864 python3.9[94744]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:53:07 np0005474864 systemd[1]: Reloading.
Oct  7 15:53:07 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:53:07 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:53:07 np0005474864 systemd[1]: Starting ovn_controller container...
Oct  7 15:53:07 np0005474864 systemd[1]: Created slice Virtual Machine and Container Slice.
Oct  7 15:53:07 np0005474864 systemd[1]: Started libcrun container.
Oct  7 15:53:07 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/47c283f896536cab9e69d35196af531ed54e457ce5827a26b09f31ccf26a22fd/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  7 15:53:07 np0005474864 systemd[1]: Started /usr/bin/podman healthcheck run a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a.
Oct  7 15:53:07 np0005474864 podman[94785]: 2025-10-07 19:53:07.839726464 +0000 UTC m=+0.179014017 container init a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  7 15:53:07 np0005474864 ovn_controller[94801]: + sudo -E kolla_set_configs
Oct  7 15:53:07 np0005474864 podman[94785]: 2025-10-07 19:53:07.874648962 +0000 UTC m=+0.213936525 container start a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  7 15:53:07 np0005474864 edpm-start-podman-container[94785]: ovn_controller
Oct  7 15:53:07 np0005474864 systemd[1]: Created slice User Slice of UID 0.
Oct  7 15:53:07 np0005474864 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct  7 15:53:07 np0005474864 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct  7 15:53:07 np0005474864 systemd[1]: Starting User Manager for UID 0...
Oct  7 15:53:08 np0005474864 edpm-start-podman-container[94784]: Creating additional drop-in dependency for "ovn_controller" (a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a)
Oct  7 15:53:08 np0005474864 podman[94808]: 2025-10-07 19:53:08.015984601 +0000 UTC m=+0.118037004 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller)
Oct  7 15:53:08 np0005474864 systemd[1]: a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a-2b21796d1633e3a2.service: Main process exited, code=exited, status=1/FAILURE
Oct  7 15:53:08 np0005474864 systemd[1]: a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a-2b21796d1633e3a2.service: Failed with result 'exit-code'.
Oct  7 15:53:08 np0005474864 systemd[1]: Reloading.
Oct  7 15:53:08 np0005474864 systemd[94834]: Queued start job for default target Main User Target.
Oct  7 15:53:08 np0005474864 systemd[94834]: Created slice User Application Slice.
Oct  7 15:53:08 np0005474864 systemd[94834]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct  7 15:53:08 np0005474864 systemd[94834]: Started Daily Cleanup of User's Temporary Directories.
Oct  7 15:53:08 np0005474864 systemd[94834]: Reached target Paths.
Oct  7 15:53:08 np0005474864 systemd[94834]: Reached target Timers.
Oct  7 15:53:08 np0005474864 systemd[94834]: Starting D-Bus User Message Bus Socket...
Oct  7 15:53:08 np0005474864 systemd[94834]: Starting Create User's Volatile Files and Directories...
Oct  7 15:53:08 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:53:08 np0005474864 systemd[94834]: Listening on D-Bus User Message Bus Socket.
Oct  7 15:53:08 np0005474864 systemd[94834]: Finished Create User's Volatile Files and Directories.
Oct  7 15:53:08 np0005474864 systemd[94834]: Reached target Sockets.
Oct  7 15:53:08 np0005474864 systemd[94834]: Reached target Basic System.
Oct  7 15:53:08 np0005474864 systemd[94834]: Reached target Main User Target.
Oct  7 15:53:08 np0005474864 systemd[94834]: Startup finished in 145ms.
Oct  7 15:53:08 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:53:08 np0005474864 systemd[1]: Started User Manager for UID 0.
Oct  7 15:53:08 np0005474864 systemd[1]: Started ovn_controller container.
Oct  7 15:53:08 np0005474864 systemd[1]: Started Session c1 of User root.
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: INFO:__main__:Validating config file
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: INFO:__main__:Writing out command to execute
Oct  7 15:53:08 np0005474864 systemd[1]: session-c1.scope: Deactivated successfully.
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: ++ cat /run_command
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: + ARGS=
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: + sudo kolla_copy_cacerts
Oct  7 15:53:08 np0005474864 systemd[1]: Started Session c2 of User root.
Oct  7 15:53:08 np0005474864 systemd[1]: session-c2.scope: Deactivated successfully.
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: + [[ ! -n '' ]]
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: + . kolla_extend_start
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: + umask 0022
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00003|main|INFO|OVN internal version is : [24.03.7-20.33.0-76.8]
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Oct  7 15:53:08 np0005474864 NetworkManager[51631]: <info>  [1759866788.5017] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Oct  7 15:53:08 np0005474864 NetworkManager[51631]: <info>  [1759866788.5030] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  7 15:53:08 np0005474864 NetworkManager[51631]: <info>  [1759866788.5056] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Oct  7 15:53:08 np0005474864 NetworkManager[51631]: <info>  [1759866788.5065] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Oct  7 15:53:08 np0005474864 NetworkManager[51631]: <info>  [1759866788.5073] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  7 15:53:08 np0005474864 kernel: br-int: entered promiscuous mode
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00014|main|INFO|OVS feature set changed, force recompute.
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00018|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00019|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00021|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00022|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00023|main|INFO|OVS feature set changed, force recompute.
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00024|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00001|statctrl(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00002|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00003|rconn(ovn_statctrl2)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  7 15:53:08 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:08Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Oct  7 15:53:08 np0005474864 NetworkManager[51631]: <info>  [1759866788.5338] manager: (ovn-67fe0b-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Oct  7 15:53:08 np0005474864 NetworkManager[51631]: <info>  [1759866788.5350] manager: (ovn-924940-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/18)
Oct  7 15:53:08 np0005474864 NetworkManager[51631]: <info>  [1759866788.5359] manager: (ovn-0be8cc-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Oct  7 15:53:08 np0005474864 kernel: genev_sys_6081: entered promiscuous mode
Oct  7 15:53:08 np0005474864 NetworkManager[51631]: <info>  [1759866788.5660] device (genev_sys_6081): carrier: link connected
Oct  7 15:53:08 np0005474864 NetworkManager[51631]: <info>  [1759866788.5666] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/20)
Oct  7 15:53:08 np0005474864 systemd-udevd[94942]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 15:53:08 np0005474864 systemd-udevd[94950]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 15:53:09 np0005474864 python3.9[95068]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:53:09 np0005474864 ovs-vsctl[95069]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Oct  7 15:53:10 np0005474864 python3.9[95221]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:53:10 np0005474864 ovs-vsctl[95223]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Oct  7 15:53:11 np0005474864 python3.9[95376]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:53:11 np0005474864 ovs-vsctl[95377]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
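The `ovs-vsctl get ... | sed 's/\"//g'` pipeline above exists because ovs-vsctl prints OVSDB string values wrapped in double quotes. The same normalization, sketched in Python on a hypothetical `ovn-cms-options` value (the real value depends on the chassis configuration, and here the key was absent, hence the `db_ctl_base` error):

```python
# ovs-vsctl prints OVSDB string values double-quoted; the playbook
# strips the quotes with sed 's/"//g'. Equivalent stripping in Python:
sample = '"enable-chassis-as-gw"'   # hypothetical ovn-cms-options value
cleaned = sample.replace('"', '')   # same effect as the sed expression
print(cleaned)                      # enable-chassis-as-gw
```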
Oct  7 15:53:12 np0005474864 systemd[1]: session-22.scope: Deactivated successfully.
Oct  7 15:53:12 np0005474864 systemd[1]: session-22.scope: Consumed 54.730s CPU time.
Oct  7 15:53:12 np0005474864 systemd-logind[805]: Session 22 logged out. Waiting for processes to exit.
Oct  7 15:53:12 np0005474864 systemd-logind[805]: Removed session 22.
Oct  7 15:53:17 np0005474864 systemd-logind[805]: New session 24 of user zuul.
Oct  7 15:53:17 np0005474864 systemd[1]: Started Session 24 of User zuul.
Oct  7 15:53:18 np0005474864 systemd[1]: Stopping User Manager for UID 0...
Oct  7 15:53:18 np0005474864 systemd[94834]: Activating special unit Exit the Session...
Oct  7 15:53:18 np0005474864 systemd[94834]: Stopped target Main User Target.
Oct  7 15:53:18 np0005474864 systemd[94834]: Stopped target Basic System.
Oct  7 15:53:18 np0005474864 systemd[94834]: Stopped target Paths.
Oct  7 15:53:18 np0005474864 systemd[94834]: Stopped target Sockets.
Oct  7 15:53:18 np0005474864 systemd[94834]: Stopped target Timers.
Oct  7 15:53:18 np0005474864 systemd[94834]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  7 15:53:18 np0005474864 systemd[94834]: Closed D-Bus User Message Bus Socket.
Oct  7 15:53:18 np0005474864 systemd[94834]: Stopped Create User's Volatile Files and Directories.
Oct  7 15:53:18 np0005474864 systemd[94834]: Removed slice User Application Slice.
Oct  7 15:53:18 np0005474864 systemd[94834]: Reached target Shutdown.
Oct  7 15:53:18 np0005474864 systemd[94834]: Finished Exit the Session.
Oct  7 15:53:18 np0005474864 systemd[94834]: Reached target Exit the Session.
Oct  7 15:53:18 np0005474864 systemd[1]: user@0.service: Deactivated successfully.
Oct  7 15:53:18 np0005474864 systemd[1]: Stopped User Manager for UID 0.
Oct  7 15:53:18 np0005474864 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct  7 15:53:18 np0005474864 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct  7 15:53:18 np0005474864 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct  7 15:53:18 np0005474864 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct  7 15:53:18 np0005474864 systemd[1]: Removed slice User Slice of UID 0.
Oct  7 15:53:19 np0005474864 python3.9[95557]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:53:20 np0005474864 python3.9[95713]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:53:21 np0005474864 python3.9[95865]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:53:22 np0005474864 python3.9[96017]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:53:23 np0005474864 python3.9[96169]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:53:23 np0005474864 python3.9[96321]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:53:24 np0005474864 python3.9[96471]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:53:26 np0005474864 python3.9[96623]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Oct  7 15:53:27 np0005474864 python3.9[96774]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:53:28 np0005474864 python3.9[96895]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759866806.9435322-220-227457448978131/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:53:29 np0005474864 python3.9[97045]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:53:29 np0005474864 python3.9[97166]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759866808.6578724-265-232773144374424/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:53:31 np0005474864 python3.9[97318]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  7 15:53:32 np0005474864 python3.9[97402]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  7 15:53:34 np0005474864 python3.9[97555]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  7 15:53:35 np0005474864 python3.9[97708]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:53:36 np0005474864 python3.9[97829]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759866815.205573-376-244288329193402/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:53:37 np0005474864 python3.9[97979]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:53:37 np0005474864 python3.9[98100]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759866816.6987913-376-113323022556391/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:53:38 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:38Z|00025|memory|INFO|16896 kB peak resident set size after 29.9 seconds
Oct  7 15:53:38 np0005474864 ovn_controller[94801]: 2025-10-07T19:53:38Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:3
Oct  7 15:53:38 np0005474864 podman[98125]: 2025-10-07 19:53:38.49367449 +0000 UTC m=+0.177283005 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  7 15:53:39 np0005474864 python3.9[98277]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:53:40 np0005474864 python3.9[98398]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759866818.8775008-508-117340985199280/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:53:41 np0005474864 python3.9[98548]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:53:41 np0005474864 python3.9[98669]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759866820.7091475-508-49773009282084/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:53:42 np0005474864 python3.9[98819]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:53:43 np0005474864 python3.9[98973]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:53:44 np0005474864 python3.9[99125]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:53:45 np0005474864 python3.9[99203]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:53:45 np0005474864 python3.9[99355]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:53:46 np0005474864 python3.9[99433]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:53:47 np0005474864 python3.9[99585]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:53:48 np0005474864 python3.9[99737]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:53:48 np0005474864 python3.9[99815]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:53:49 np0005474864 python3.9[99967]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:53:50 np0005474864 python3.9[100045]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:53:51 np0005474864 python3.9[100197]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:53:51 np0005474864 systemd[1]: Reloading.
Oct  7 15:53:51 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:53:51 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:53:52 np0005474864 python3.9[100386]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:53:52 np0005474864 python3.9[100464]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:53:53 np0005474864 python3.9[100616]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:53:54 np0005474864 python3.9[100694]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:53:55 np0005474864 python3.9[100846]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:53:55 np0005474864 systemd[1]: Reloading.
Oct  7 15:53:55 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:53:55 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:53:55 np0005474864 systemd[1]: Starting Create netns directory...
Oct  7 15:53:55 np0005474864 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  7 15:53:55 np0005474864 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  7 15:53:55 np0005474864 systemd[1]: Finished Create netns directory.
Oct  7 15:53:56 np0005474864 python3.9[101039]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:53:57 np0005474864 python3.9[101191]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:53:58 np0005474864 python3.9[101314]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759866837.2320542-961-114516689196989/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:53:59 np0005474864 python3.9[101466]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:54:00 np0005474864 python3.9[101618]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:54:01 np0005474864 python3.9[101741]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759866839.8697386-1036-47598151538445/.source.json _original_basename=.jllf_qi9 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:54:01 np0005474864 python3.9[101893]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:54:04 np0005474864 python3.9[102320]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Oct  7 15:54:05 np0005474864 python3.9[102472]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  7 15:54:06 np0005474864 python3.9[102624]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  7 15:54:08 np0005474864 python3[102802]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  7 15:54:08 np0005474864 podman[102840]: 2025-10-07 19:54:08.822301331 +0000 UTC m=+0.072512035 container create a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, 
maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  7 15:54:08 np0005474864 podman[102840]: 2025-10-07 19:54:08.781663553 +0000 UTC m=+0.031874307 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 15:54:08 np0005474864 python3[102802]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host 
--pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 15:54:09 np0005474864 podman[102903]: 2025-10-07 19:54:09.490753289 +0000 UTC m=+0.168166173 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Oct  7 15:54:09 np0005474864 python3.9[103055]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:54:10 np0005474864 python3.9[103209]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:54:11 np0005474864 python3.9[103285]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:54:12 np0005474864 python3.9[103436]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759866851.4837074-1300-80498682846679/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:54:12 np0005474864 python3.9[103512]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  7 15:54:12 np0005474864 systemd[1]: Reloading.
Oct  7 15:54:12 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:54:12 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:54:13 np0005474864 python3.9[103624]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:54:13 np0005474864 systemd[1]: Reloading.
Oct  7 15:54:13 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:54:13 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:54:14 np0005474864 systemd[1]: Starting ovn_metadata_agent container...
Oct  7 15:54:14 np0005474864 systemd[1]: Started libcrun container.
Oct  7 15:54:14 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e49af6d72afcbc87fc594e7f142e650dc37107488d3a0138af47522dc14de6b5/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Oct  7 15:54:14 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e49af6d72afcbc87fc594e7f142e650dc37107488d3a0138af47522dc14de6b5/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 15:54:14 np0005474864 systemd[1]: Started /usr/bin/podman healthcheck run a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c.
Oct  7 15:54:14 np0005474864 podman[103665]: 2025-10-07 19:54:14.256183627 +0000 UTC m=+0.171787329 container init a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: + sudo -E kolla_set_configs
Oct  7 15:54:14 np0005474864 podman[103665]: 2025-10-07 19:54:14.285044293 +0000 UTC m=+0.200647915 container start a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  7 15:54:14 np0005474864 edpm-start-podman-container[103665]: ovn_metadata_agent
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: INFO:__main__:Validating config file
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: INFO:__main__:Copying service configuration files
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: INFO:__main__:Writing out command to execute
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: INFO:__main__:Setting permission for /var/lib/neutron
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: INFO:__main__:Setting permission for /var/lib/neutron/external
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: ++ cat /run_command
Oct  7 15:54:14 np0005474864 edpm-start-podman-container[103664]: Creating additional drop-in dependency for "ovn_metadata_agent" (a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c)
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: + CMD=neutron-ovn-metadata-agent
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: + ARGS=
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: + sudo kolla_copy_cacerts
Oct  7 15:54:14 np0005474864 systemd[1]: Reloading.
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: + [[ ! -n '' ]]
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: + . kolla_extend_start
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: Running command: 'neutron-ovn-metadata-agent'
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: + umask 0022
Oct  7 15:54:14 np0005474864 ovn_metadata_agent[103680]: + exec neutron-ovn-metadata-agent
Oct  7 15:54:14 np0005474864 podman[103687]: 2025-10-07 19:54:14.401114123 +0000 UTC m=+0.100021785 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, 
org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 15:54:14 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:54:14 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:54:14 np0005474864 systemd[1]: Started ovn_metadata_agent container.
Oct  7 15:54:15 np0005474864 systemd[1]: session-24.scope: Deactivated successfully.
Oct  7 15:54:15 np0005474864 systemd[1]: session-24.scope: Consumed 41.439s CPU time.
Oct  7 15:54:15 np0005474864 systemd-logind[805]: Session 24 logged out. Waiting for processes to exit.
Oct  7 15:54:15 np0005474864 systemd-logind[805]: Removed session 24.
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.128 103685 INFO neutron.common.config [-] Logging enabled!#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.128 103685 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.128 103685 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.129 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.129 103685 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.129 103685 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.129 103685 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.129 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.129 103685 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.129 103685 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.129 103685 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.130 103685 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.130 103685 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.130 103685 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.130 103685 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.130 103685 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.130 103685 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.130 103685 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.130 103685 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.130 103685 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.130 103685 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.131 103685 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.131 103685 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.131 103685 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.131 103685 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.131 103685 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.131 103685 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.131 103685 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.131 103685 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.131 103685 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.131 103685 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.132 103685 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.132 103685 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.132 103685 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.132 103685 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.132 103685 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.132 103685 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.132 103685 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.132 103685 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.132 103685 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.133 103685 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.133 103685 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.133 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.133 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.133 103685 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.133 103685 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.133 103685 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.133 103685 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.133 103685 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.133 103685 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.134 103685 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.134 103685 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.134 103685 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.134 103685 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.134 103685 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.134 103685 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.134 103685 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.134 103685 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.134 103685 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.134 103685 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.135 103685 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.135 103685 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.135 103685 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.135 103685 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.135 103685 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.135 103685 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.135 103685 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.135 103685 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.135 103685 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.135 103685 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.136 103685 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.136 103685 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.136 103685 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.136 103685 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.136 103685 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.136 103685 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.136 103685 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.136 103685 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.136 103685 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.137 103685 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.137 103685 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.137 103685 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.137 103685 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.137 103685 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.137 103685 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.137 103685 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.137 103685 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.137 103685 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.137 103685 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.138 103685 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.138 103685 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.138 103685 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.138 103685 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.138 103685 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.138 103685 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.138 103685 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.138 103685 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.138 103685 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.138 103685 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.138 103685 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.139 103685 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.139 103685 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.139 103685 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.139 103685 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.139 103685 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.139 103685 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.139 103685 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.139 103685 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.139 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.139 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.140 103685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.140 103685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.140 103685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.140 103685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.140 103685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.140 103685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.140 103685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.140 103685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.140 103685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.141 103685 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.141 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.141 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.141 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.141 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.141 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.141 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.141 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.141 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.141 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.142 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.142 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.142 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.142 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.142 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.142 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.142 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.142 103685 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.142 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.143 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.143 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.143 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.143 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.143 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.143 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.143 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.143 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.143 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.143 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.144 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.144 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.144 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.144 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.144 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.144 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.144 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.144 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.144 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.145 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.145 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.145 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.145 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.145 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.145 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.145 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.145 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.145 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.145 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.146 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.146 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.146 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.146 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.146 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.146 103685 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.146 103685 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.146 103685 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.146 103685 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.146 103685 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.147 103685 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.147 103685 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.147 103685 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.147 103685 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.147 103685 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.147 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.147 103685 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.147 103685 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.147 103685 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.148 103685 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.148 103685 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.148 103685 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.148 103685 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.148 103685 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.148 103685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.148 103685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.148 103685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.148 103685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.148 103685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.149 103685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.149 103685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.149 103685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.149 103685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.149 103685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.149 103685 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.149 103685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.149 103685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.149 103685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.150 103685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.150 103685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.150 103685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.150 103685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.150 103685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.150 103685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.150 103685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.150 103685 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.151 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.151 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.151 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.151 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.151 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.151 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.151 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.151 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.152 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.152 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.152 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.152 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.152 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.152 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.152 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.152 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.152 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.152 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.153 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.153 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.153 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.153 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.153 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.153 103685 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.153 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.153 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.153 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.154 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.154 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.154 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.154 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.154 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.154 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.154 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.154 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.154 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.154 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.155 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.155 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.155 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.155 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.155 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.155 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.155 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.155 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.155 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.156 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.156 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.156 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.156 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.156 103685 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.156 103685 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.156 103685 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.156 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.156 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.156 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.157 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.157 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.157 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.157 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.157 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.157 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.157 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.157 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.157 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.158 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.158 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.158 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.158 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.158 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.158 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.158 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.158 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.158 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.158 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.159 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.159 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.159 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.159 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.159 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.159 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.159 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.159 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.159 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.159 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.160 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.160 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.160 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.160 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.160 103685 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.160 103685 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.170 103685 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.171 103685 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.171 103685 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.171 103685 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.171 103685 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.182 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 2d917af9-e2c2-4b32-93ba-e5708271f327 (UUID: 2d917af9-e2c2-4b32-93ba-e5708271f327) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.208 103685 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.208 103685 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.208 103685 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.208 103685 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.211 103685 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.219 103685 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.228 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '2d917af9-e2c2-4b32-93ba-e5708271f327'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], external_ids={}, name=2d917af9-e2c2-4b32-93ba-e5708271f327, nb_cfg_timestamp=1759866796524, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.229 103685 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f1427a62bb0>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.230 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.230 103685 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.230 103685 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.230 103685 INFO oslo_service.service [-] Starting 1 workers#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.234 103685 DEBUG oslo_service.service [-] Started child 103792 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.238 103685 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmpxosc3x5h/privsep.sock']#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.238 103792 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-438957'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.258 103792 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.258 103792 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.258 103792 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.262 103792 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.270 103792 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Oct  7 15:54:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.277 103792 INFO eventlet.wsgi.server [-] (103792) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Oct  7 15:54:16 np0005474864 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Oct  7 15:54:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:17.011 103685 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  7 15:54:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:17.011 103685 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpxosc3x5h/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  7 15:54:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.832 103797 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  7 15:54:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.839 103797 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  7 15:54:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.843 103797 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Oct  7 15:54:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:16.843 103797 INFO oslo.privsep.daemon [-] privsep daemon running as pid 103797#033[00m
Oct  7 15:54:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:17.014 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[369bc44d-b07a-40a0-a400-045730b69e84]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 15:54:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:17.546 103797 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 15:54:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:17.547 103797 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 15:54:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:17.547 103797 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.055 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[ebf112ea-9de2-4069-8f0d-32fe296b2797]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.060 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, column=external_ids, values=({'neutron:ovn-metadata-id': '2d05402a-2d2e-58e6-a460-a27027d72415'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.072 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.077 103685 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.077 103685 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.077 103685 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.077 103685 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.077 103685 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.077 103685 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.078 103685 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.078 103685 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.078 103685 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.078 103685 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.078 103685 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.078 103685 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.078 103685 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.078 103685 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.079 103685 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.079 103685 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.079 103685 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.079 103685 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.079 103685 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.079 103685 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.079 103685 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.079 103685 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.079 103685 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.080 103685 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.080 103685 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.080 103685 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.080 103685 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.080 103685 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.080 103685 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.080 103685 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.080 103685 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.081 103685 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.081 103685 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.081 103685 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.081 103685 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.081 103685 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.081 103685 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.081 103685 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.082 103685 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.082 103685 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.082 103685 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.082 103685 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.082 103685 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.082 103685 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.082 103685 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.082 103685 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.082 103685 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.082 103685 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.083 103685 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.083 103685 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.083 103685 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.083 103685 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.083 103685 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.083 103685 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.083 103685 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.083 103685 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.083 103685 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.083 103685 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.084 103685 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.084 103685 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.084 103685 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.084 103685 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.084 103685 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.084 103685 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.084 103685 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.084 103685 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.084 103685 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.084 103685 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.085 103685 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.085 103685 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.085 103685 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.085 103685 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.085 103685 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.085 103685 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.085 103685 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.085 103685 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.085 103685 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.085 103685 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.086 103685 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.086 103685 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.086 103685 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.086 103685 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.086 103685 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.086 103685 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.086 103685 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.086 103685 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.086 103685 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.086 103685 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.087 103685 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.087 103685 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.087 103685 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.087 103685 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.087 103685 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.087 103685 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.087 103685 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.087 103685 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.088 103685 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.088 103685 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.088 103685 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.088 103685 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.088 103685 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.088 103685 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.088 103685 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.088 103685 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.088 103685 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.088 103685 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.089 103685 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.089 103685 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.089 103685 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.089 103685 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.089 103685 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.089 103685 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.089 103685 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.089 103685 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.089 103685 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.090 103685 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.090 103685 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.090 103685 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.090 103685 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.090 103685 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.090 103685 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.090 103685 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.090 103685 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.091 103685 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.091 103685 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.091 103685 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.091 103685 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.091 103685 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.091 103685 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.091 103685 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.091 103685 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.091 103685 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.092 103685 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.092 103685 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.092 103685 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.092 103685 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.092 103685 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.092 103685 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.092 103685 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.092 103685 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.092 103685 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.093 103685 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.093 103685 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.093 103685 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.093 103685 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.093 103685 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.093 103685 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.093 103685 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.093 103685 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.093 103685 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.093 103685 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.093 103685 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.094 103685 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.094 103685 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.094 103685 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.094 103685 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.094 103685 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.094 103685 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.094 103685 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.094 103685 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.094 103685 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.094 103685 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.095 103685 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.095 103685 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.095 103685 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.095 103685 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.095 103685 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.095 103685 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.095 103685 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.095 103685 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.095 103685 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.095 103685 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.095 103685 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.096 103685 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.096 103685 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.096 103685 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.096 103685 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.096 103685 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.096 103685 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.096 103685 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.096 103685 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.096 103685 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.096 103685 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.097 103685 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.097 103685 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.097 103685 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.097 103685 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.097 103685 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.097 103685 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.097 103685 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.097 103685 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.097 103685 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.098 103685 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.098 103685 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.098 103685 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.098 103685 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.098 103685 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.098 103685 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.098 103685 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.098 103685 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.098 103685 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.098 103685 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.099 103685 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.099 103685 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.099 103685 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.099 103685 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.099 103685 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.099 103685 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.099 103685 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.099 103685 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.099 103685 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.099 103685 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.100 103685 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.100 103685 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.100 103685 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.100 103685 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.100 103685 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.100 103685 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.100 103685 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.100 103685 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.100 103685 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.100 103685 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.100 103685 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.101 103685 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.101 103685 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.101 103685 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.101 103685 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.101 103685 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.101 103685 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.101 103685 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.101 103685 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.101 103685 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.101 103685 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.101 103685 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.102 103685 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.102 103685 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.102 103685 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.102 103685 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.102 103685 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.102 103685 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.102 103685 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.102 103685 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.102 103685 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.103 103685 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.103 103685 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.103 103685 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.103 103685 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.103 103685 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.103 103685 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.103 103685 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.103 103685 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.103 103685 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.103 103685 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.104 103685 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.104 103685 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.104 103685 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.104 103685 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.104 103685 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.104 103685 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.104 103685 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.104 103685 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.104 103685 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.104 103685 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.105 103685 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.105 103685 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.105 103685 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.105 103685 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.105 103685 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.105 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.105 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.105 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.105 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.105 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.106 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.106 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.106 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.106 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.106 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.106 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.106 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.106 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.106 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.106 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.107 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.107 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.107 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.107 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.107 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.107 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.107 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.107 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.107 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.107 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.108 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.108 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.108 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.108 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.108 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.108 103685 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.108 103685 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.108 103685 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.108 103685 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.109 103685 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 15:54:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:54:18.109 103685 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  7 15:54:20 np0005474864 systemd-logind[805]: New session 25 of user zuul.
Oct  7 15:54:20 np0005474864 systemd[1]: Started Session 25 of User zuul.
Oct  7 15:54:21 np0005474864 python3.9[103955]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:54:23 np0005474864 python3.9[104111]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:54:24 np0005474864 python3.9[104276]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  7 15:54:24 np0005474864 systemd[1]: Reloading.
Oct  7 15:54:24 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:54:24 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:54:25 np0005474864 python3.9[104461]: ansible-ansible.builtin.service_facts Invoked
Oct  7 15:54:26 np0005474864 network[104478]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  7 15:54:26 np0005474864 network[104479]: 'network-scripts' will be removed from distribution in near future.
Oct  7 15:54:26 np0005474864 network[104480]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  7 15:54:34 np0005474864 python3.9[104744]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:54:36 np0005474864 python3.9[104897]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:54:37 np0005474864 python3.9[105050]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:54:38 np0005474864 python3.9[105203]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:54:38 np0005474864 python3.9[105356]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:54:39 np0005474864 python3.9[105509]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:54:39 np0005474864 podman[105511]: 2025-10-07 19:54:39.991693213 +0000 UTC m=+0.134684451 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001)
Oct  7 15:54:40 np0005474864 python3.9[105687]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:54:42 np0005474864 python3.9[105840]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:54:42 np0005474864 python3.9[105992]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:54:43 np0005474864 python3.9[106144]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:54:44 np0005474864 python3.9[106296]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:54:44 np0005474864 podman[106420]: 2025-10-07 19:54:44.839148538 +0000 UTC m=+0.085046064 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  7 15:54:45 np0005474864 python3.9[106464]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:54:45 np0005474864 python3.9[106619]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:54:46 np0005474864 python3.9[106771]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:54:47 np0005474864 python3.9[106923]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:54:48 np0005474864 python3.9[107075]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:54:48 np0005474864 python3.9[107227]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:54:49 np0005474864 python3.9[107379]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:54:50 np0005474864 python3.9[107531]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:54:51 np0005474864 python3.9[107683]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:54:51 np0005474864 python3.9[107835]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:54:52 np0005474864 python3.9[107987]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:54:53 np0005474864 python3.9[108139]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  7 15:54:54 np0005474864 python3.9[108291]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  7 15:54:54 np0005474864 systemd[1]: Reloading.
Oct  7 15:54:54 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:54:54 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:54:55 np0005474864 python3.9[108479]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:54:56 np0005474864 python3.9[108632]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:54:57 np0005474864 python3.9[108785]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:54:58 np0005474864 python3.9[108938]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:54:58 np0005474864 python3.9[109091]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:54:59 np0005474864 python3.9[109244]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:55:00 np0005474864 python3.9[109397]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:55:01 np0005474864 python3.9[109550]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Oct  7 15:55:03 np0005474864 python3.9[109703]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  7 15:55:04 np0005474864 python3.9[109861]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  7 15:55:05 np0005474864 python3.9[110021]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  7 15:55:06 np0005474864 python3.9[110105]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  7 15:55:10 np0005474864 podman[110116]: 2025-10-07 19:55:10.438567294 +0000 UTC m=+0.130612483 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 15:55:15 np0005474864 podman[110144]: 2025-10-07 19:55:15.386682742 +0000 UTC m=+0.080322347 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  7 15:55:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:55:16.162 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 15:55:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:55:16.163 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 15:55:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:55:16.163 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 15:55:40 np0005474864 kernel: SELinux:  Converting 2752 SID table entries...
Oct  7 15:55:40 np0005474864 kernel: SELinux:  policy capability network_peer_controls=1
Oct  7 15:55:40 np0005474864 kernel: SELinux:  policy capability open_perms=1
Oct  7 15:55:40 np0005474864 kernel: SELinux:  policy capability extended_socket_class=1
Oct  7 15:55:40 np0005474864 kernel: SELinux:  policy capability always_check_network=0
Oct  7 15:55:40 np0005474864 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  7 15:55:40 np0005474864 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  7 15:55:40 np0005474864 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  7 15:55:41 np0005474864 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=12 res=1
Oct  7 15:55:41 np0005474864 podman[110353]: 2025-10-07 19:55:41.440530962 +0000 UTC m=+0.120996277 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  7 15:55:46 np0005474864 podman[110380]: 2025-10-07 19:55:46.433674053 +0000 UTC m=+0.118988937 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_metadata_agent)
Oct  7 15:55:49 np0005474864 kernel: SELinux:  Converting 2752 SID table entries...
Oct  7 15:55:49 np0005474864 kernel: SELinux:  policy capability network_peer_controls=1
Oct  7 15:55:49 np0005474864 kernel: SELinux:  policy capability open_perms=1
Oct  7 15:55:49 np0005474864 kernel: SELinux:  policy capability extended_socket_class=1
Oct  7 15:55:49 np0005474864 kernel: SELinux:  policy capability always_check_network=0
Oct  7 15:55:49 np0005474864 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  7 15:55:49 np0005474864 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  7 15:55:49 np0005474864 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  7 15:56:12 np0005474864 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Oct  7 15:56:12 np0005474864 podman[116454]: 2025-10-07 19:56:12.44705667 +0000 UTC m=+0.120678638 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible)
Oct  7 15:56:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:56:16.164 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 15:56:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:56:16.165 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 15:56:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:56:16.165 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 15:56:17 np0005474864 podman[119132]: 2025-10-07 19:56:17.361904827 +0000 UTC m=+0.057605205 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  7 15:56:43 np0005474864 podman[127199]: 2025-10-07 19:56:43.41306509 +0000 UTC m=+0.108545245 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, container_name=ovn_controller)
Oct  7 15:56:45 np0005474864 kernel: SELinux:  Converting 2753 SID table entries...
Oct  7 15:56:45 np0005474864 kernel: SELinux:  policy capability network_peer_controls=1
Oct  7 15:56:45 np0005474864 kernel: SELinux:  policy capability open_perms=1
Oct  7 15:56:45 np0005474864 kernel: SELinux:  policy capability extended_socket_class=1
Oct  7 15:56:45 np0005474864 kernel: SELinux:  policy capability always_check_network=0
Oct  7 15:56:45 np0005474864 kernel: SELinux:  policy capability cgroup_seclabel=1
Oct  7 15:56:45 np0005474864 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Oct  7 15:56:45 np0005474864 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Oct  7 15:56:46 np0005474864 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Oct  7 15:56:46 np0005474864 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Oct  7 15:56:46 np0005474864 dbus-broker-launch[766]: Noticed file-system modification, trigger reload.
Oct  7 15:56:47 np0005474864 podman[127257]: 2025-10-07 19:56:47.991054033 +0000 UTC m=+0.100967850 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 15:56:54 np0005474864 systemd[1]: Stopping OpenSSH server daemon...
Oct  7 15:56:54 np0005474864 systemd[1]: sshd.service: Deactivated successfully.
Oct  7 15:56:54 np0005474864 systemd[1]: Stopped OpenSSH server daemon.
Oct  7 15:56:54 np0005474864 systemd[1]: sshd.service: Consumed 4.971s CPU time, read 0B from disk, written 8.0K to disk.
Oct  7 15:56:54 np0005474864 systemd[1]: Stopped target sshd-keygen.target.
Oct  7 15:56:54 np0005474864 systemd[1]: Stopping sshd-keygen.target...
Oct  7 15:56:54 np0005474864 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  7 15:56:54 np0005474864 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  7 15:56:54 np0005474864 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Oct  7 15:56:54 np0005474864 systemd[1]: Reached target sshd-keygen.target.
Oct  7 15:56:54 np0005474864 systemd[1]: Starting OpenSSH server daemon...
Oct  7 15:56:54 np0005474864 systemd[1]: Started OpenSSH server daemon.
Oct  7 15:56:56 np0005474864 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  7 15:56:56 np0005474864 systemd[1]: Starting man-db-cache-update.service...
Oct  7 15:56:56 np0005474864 systemd[1]: Reloading.
Oct  7 15:56:56 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:56:56 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:56:57 np0005474864 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  7 15:57:00 np0005474864 systemd[1]: Starting PackageKit Daemon...
Oct  7 15:57:00 np0005474864 systemd[1]: Started PackageKit Daemon.
Oct  7 15:57:02 np0005474864 python3.9[133253]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  7 15:57:03 np0005474864 systemd[1]: Reloading.
Oct  7 15:57:03 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:57:03 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:57:04 np0005474864 python3.9[135291]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  7 15:57:04 np0005474864 systemd[1]: Reloading.
Oct  7 15:57:04 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:57:04 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:57:05 np0005474864 python3.9[136513]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  7 15:57:05 np0005474864 systemd[1]: Reloading.
Oct  7 15:57:05 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:57:05 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:57:05 np0005474864 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  7 15:57:05 np0005474864 systemd[1]: Finished man-db-cache-update.service.
Oct  7 15:57:05 np0005474864 systemd[1]: man-db-cache-update.service: Consumed 11.402s CPU time.
Oct  7 15:57:05 np0005474864 systemd[1]: run-rc34942c40ad5498f8e31d4f3566851e7.service: Deactivated successfully.
Oct  7 15:57:06 np0005474864 python3.9[137225]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  7 15:57:06 np0005474864 systemd[1]: Reloading.
Oct  7 15:57:06 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:57:06 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:57:07 np0005474864 python3.9[137415]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  7 15:57:07 np0005474864 systemd[1]: Reloading.
Oct  7 15:57:08 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:57:08 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:57:08 np0005474864 python3.9[137606]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  7 15:57:09 np0005474864 systemd[1]: Reloading.
Oct  7 15:57:09 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:57:09 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:57:10 np0005474864 python3.9[137795]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  7 15:57:10 np0005474864 systemd[1]: Reloading.
Oct  7 15:57:10 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:57:10 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:57:11 np0005474864 python3.9[137985]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  7 15:57:12 np0005474864 python3.9[138140]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  7 15:57:12 np0005474864 systemd[1]: Reloading.
Oct  7 15:57:12 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:57:12 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:57:13 np0005474864 python3.9[138330]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Oct  7 15:57:13 np0005474864 systemd[1]: Reloading.
Oct  7 15:57:13 np0005474864 podman[138332]: 2025-10-07 19:57:13.791614093 +0000 UTC m=+0.099462566 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  7 15:57:13 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:57:13 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:57:14 np0005474864 systemd[1]: Listening on libvirt proxy daemon socket.
Oct  7 15:57:14 np0005474864 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Oct  7 15:57:15 np0005474864 python3.9[138549]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  7 15:57:15 np0005474864 python3.9[138704]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  7 15:57:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:57:16.166 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 15:57:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:57:16.168 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 15:57:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:57:16.168 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 15:57:16 np0005474864 python3.9[138859]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  7 15:57:17 np0005474864 python3.9[139014]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  7 15:57:18 np0005474864 podman[139141]: 2025-10-07 19:57:18.382790696 +0000 UTC m=+0.094991988 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent)
Oct  7 15:57:18 np0005474864 python3.9[139184]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  7 15:57:19 np0005474864 python3.9[139343]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  7 15:57:20 np0005474864 python3.9[139498]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  7 15:57:21 np0005474864 python3.9[139653]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  7 15:57:22 np0005474864 python3.9[139808]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  7 15:57:23 np0005474864 python3.9[139963]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  7 15:57:24 np0005474864 python3.9[140118]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  7 15:57:24 np0005474864 python3.9[140273]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  7 15:57:25 np0005474864 python3.9[140428]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  7 15:57:26 np0005474864 python3.9[140583]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Oct  7 15:57:30 np0005474864 python3.9[140738]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:57:30 np0005474864 python3.9[140890]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:57:31 np0005474864 python3.9[141042]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:57:32 np0005474864 python3.9[141194]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:57:33 np0005474864 python3.9[141346]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:57:33 np0005474864 python3.9[141498]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:57:34 np0005474864 python3.9[141650]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:57:35 np0005474864 python3.9[141775]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759867054.223059-1625-143040695248355/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:36 np0005474864 python3.9[141927]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:57:37 np0005474864 python3.9[142052]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759867055.9827614-1625-278792833710420/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:38 np0005474864 python3.9[142204]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:57:38 np0005474864 python3.9[142329]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759867057.4092996-1625-161234440613152/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:39 np0005474864 python3.9[142481]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:57:40 np0005474864 python3.9[142606]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759867058.8915136-1625-53774298935694/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:40 np0005474864 python3.9[142758]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:57:41 np0005474864 python3.9[142883]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759867060.3370442-1625-77105445484917/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:42 np0005474864 python3.9[143035]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:57:42 np0005474864 python3.9[143160]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759867061.5375447-1625-168776103262778/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:43 np0005474864 python3.9[143312]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:57:44 np0005474864 python3.9[143435]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759867062.9332023-1625-249431408523720/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:44 np0005474864 podman[143436]: 2025-10-07 19:57:44.245573384 +0000 UTC m=+0.107023014 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Oct  7 15:57:44 np0005474864 python3.9[143614]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:57:45 np0005474864 python3.9[143739]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1759867064.2860866-1625-990090542383/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:47 np0005474864 python3.9[143891]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Oct  7 15:57:48 np0005474864 podman[144016]: 2025-10-07 19:57:48.572913752 +0000 UTC m=+0.060779496 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  7 15:57:48 np0005474864 python3.9[144059]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:49 np0005474864 python3.9[144215]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:50 np0005474864 python3.9[144367]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:51 np0005474864 python3.9[144519]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:51 np0005474864 python3.9[144671]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:52 np0005474864 python3.9[144823]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:53 np0005474864 python3.9[144975]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:53 np0005474864 python3.9[145127]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:54 np0005474864 python3.9[145279]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:55 np0005474864 python3.9[145431]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:55 np0005474864 python3.9[145583]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:56 np0005474864 python3.9[145735]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:57 np0005474864 python3.9[145887]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:58 np0005474864 python3.9[146039]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:57:58 np0005474864 python3.9[146191]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:57:59 np0005474864 python3.9[146314]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759867078.3741384-2287-12448016940646/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:00 np0005474864 python3.9[146466]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:00 np0005474864 python3.9[146589]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759867079.7526286-2287-187676617128783/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:01 np0005474864 python3.9[146741]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:02 np0005474864 python3.9[146864]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759867081.1257634-2287-222450843125691/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:03 np0005474864 python3.9[147016]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:03 np0005474864 python3.9[147139]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759867082.4660437-2287-159289754147654/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:04 np0005474864 python3.9[147291]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:04 np0005474864 python3.9[147414]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759867083.8350542-2287-228608136344754/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:05 np0005474864 python3.9[147566]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:06 np0005474864 python3.9[147689]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759867085.1392536-2287-158204831326121/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:07 np0005474864 python3.9[147841]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:07 np0005474864 python3.9[147964]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759867086.5061364-2287-220140155023530/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:08 np0005474864 python3.9[148116]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:08 np0005474864 python3.9[148239]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759867087.7666135-2287-212582828005095/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:09 np0005474864 python3.9[148391]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:10 np0005474864 python3.9[148514]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759867089.069501-2287-144901810039040/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:10 np0005474864 python3.9[148666]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:11 np0005474864 python3.9[148789]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759867090.3810964-2287-46460539055216/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:12 np0005474864 python3.9[148941]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:12 np0005474864 python3.9[149064]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759867091.699245-2287-253581134238232/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:13 np0005474864 python3.9[149216]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:14 np0005474864 python3.9[149339]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759867093.1283417-2287-183867527670013/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:14 np0005474864 podman[149340]: 2025-10-07 19:58:14.463628317 +0000 UTC m=+0.155907885 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  7 15:58:15 np0005474864 python3.9[149518]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:15 np0005474864 python3.9[149641]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759867094.6505153-2287-132355892986607/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:58:16.167 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 15:58:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:58:16.168 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 15:58:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:58:16.168 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 15:58:16 np0005474864 python3.9[149793]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:17 np0005474864 python3.9[149916]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759867096.0802462-2287-272689626139488/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:18 np0005474864 podman[150040]: 2025-10-07 19:58:18.812565115 +0000 UTC m=+0.082259541 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  7 15:58:18 np0005474864 python3.9[150077]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:58:19 np0005474864 python3.9[150240]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Oct  7 15:58:21 np0005474864 dbus-broker-launch[777]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Oct  7 15:58:22 np0005474864 python3.9[150396]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:22 np0005474864 python3.9[150548]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:23 np0005474864 python3.9[150700]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:24 np0005474864 python3.9[150852]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:25 np0005474864 python3.9[151004]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:26 np0005474864 python3.9[151157]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:27 np0005474864 python3.9[151310]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:28 np0005474864 python3.9[151462]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:28 np0005474864 python3.9[151614]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:29 np0005474864 python3.9[151766]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:30 np0005474864 python3.9[151918]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 15:58:30 np0005474864 systemd[1]: Reloading.
Oct  7 15:58:30 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:58:30 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:58:30 np0005474864 systemd[1]: Starting libvirt logging daemon socket...
Oct  7 15:58:30 np0005474864 systemd[1]: Listening on libvirt logging daemon socket.
Oct  7 15:58:30 np0005474864 systemd[1]: Starting libvirt logging daemon admin socket...
Oct  7 15:58:30 np0005474864 systemd[1]: Listening on libvirt logging daemon admin socket.
Oct  7 15:58:30 np0005474864 systemd[1]: Starting libvirt logging daemon...
Oct  7 15:58:31 np0005474864 systemd[1]: Started libvirt logging daemon.
Oct  7 15:58:31 np0005474864 python3.9[152111]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 15:58:31 np0005474864 systemd[1]: Reloading.
Oct  7 15:58:31 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:58:31 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:58:32 np0005474864 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Oct  7 15:58:32 np0005474864 systemd[1]: Starting libvirt nodedev daemon socket...
Oct  7 15:58:32 np0005474864 systemd[1]: Listening on libvirt nodedev daemon socket.
Oct  7 15:58:32 np0005474864 systemd[1]: Starting libvirt nodedev daemon admin socket...
Oct  7 15:58:32 np0005474864 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Oct  7 15:58:32 np0005474864 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Oct  7 15:58:32 np0005474864 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Oct  7 15:58:32 np0005474864 systemd[1]: Starting libvirt nodedev daemon...
Oct  7 15:58:32 np0005474864 systemd[1]: Started libvirt nodedev daemon.
Oct  7 15:58:32 np0005474864 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Oct  7 15:58:32 np0005474864 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Oct  7 15:58:32 np0005474864 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Oct  7 15:58:33 np0005474864 python3.9[152326]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 15:58:33 np0005474864 systemd[1]: Reloading.
Oct  7 15:58:33 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:58:33 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:58:33 np0005474864 systemd[1]: Starting libvirt proxy daemon admin socket...
Oct  7 15:58:33 np0005474864 systemd[1]: Starting libvirt proxy daemon read-only socket...
Oct  7 15:58:33 np0005474864 systemd[1]: Listening on libvirt proxy daemon admin socket.
Oct  7 15:58:33 np0005474864 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Oct  7 15:58:33 np0005474864 systemd[1]: Starting libvirt proxy daemon...
Oct  7 15:58:33 np0005474864 systemd[1]: Started libvirt proxy daemon.
Oct  7 15:58:33 np0005474864 setroubleshoot[152147]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 76c80980-0613-49da-ade8-fb1be7182bb9
Oct  7 15:58:33 np0005474864 setroubleshoot[152147]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Oct  7 15:58:33 np0005474864 setroubleshoot[152147]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l 76c80980-0613-49da-ade8-fb1be7182bb9
Oct  7 15:58:33 np0005474864 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  7 15:58:33 np0005474864 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  7 15:58:33 np0005474864 setroubleshoot[152147]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Oct  7 15:58:34 np0005474864 python3.9[152545]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 15:58:34 np0005474864 systemd[1]: Reloading.
Oct  7 15:58:34 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:58:34 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:58:35 np0005474864 systemd[1]: Listening on libvirt locking daemon socket.
Oct  7 15:58:35 np0005474864 systemd[1]: Starting libvirt QEMU daemon socket...
Oct  7 15:58:35 np0005474864 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Oct  7 15:58:35 np0005474864 systemd[1]: Starting Virtual Machine and Container Registration Service...
Oct  7 15:58:35 np0005474864 systemd[1]: Listening on libvirt QEMU daemon socket.
Oct  7 15:58:35 np0005474864 systemd[1]: Starting libvirt QEMU daemon admin socket...
Oct  7 15:58:35 np0005474864 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Oct  7 15:58:35 np0005474864 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Oct  7 15:58:35 np0005474864 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Oct  7 15:58:35 np0005474864 systemd[1]: Started Virtual Machine and Container Registration Service.
Oct  7 15:58:35 np0005474864 systemd[1]: Starting libvirt QEMU daemon...
Oct  7 15:58:35 np0005474864 systemd[1]: Started libvirt QEMU daemon.
Oct  7 15:58:36 np0005474864 python3.9[152759]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 15:58:36 np0005474864 systemd[1]: Reloading.
Oct  7 15:58:36 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:58:36 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:58:36 np0005474864 systemd[1]: Starting libvirt secret daemon socket...
Oct  7 15:58:36 np0005474864 systemd[1]: Listening on libvirt secret daemon socket.
Oct  7 15:58:36 np0005474864 systemd[1]: Starting libvirt secret daemon admin socket...
Oct  7 15:58:36 np0005474864 systemd[1]: Starting libvirt secret daemon read-only socket...
Oct  7 15:58:36 np0005474864 systemd[1]: Listening on libvirt secret daemon admin socket.
Oct  7 15:58:36 np0005474864 systemd[1]: Listening on libvirt secret daemon read-only socket.
Oct  7 15:58:36 np0005474864 systemd[1]: Starting libvirt secret daemon...
Oct  7 15:58:36 np0005474864 systemd[1]: Started libvirt secret daemon.
Oct  7 15:58:38 np0005474864 python3.9[152969]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:38 np0005474864 python3.9[153122]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  7 15:58:40 np0005474864 python3.9[153274]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:40 np0005474864 python3.9[153397]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867119.5995264-3323-24190429997391/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:41 np0005474864 python3.9[153549]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:42 np0005474864 python3.9[153701]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:43 np0005474864 python3.9[153779]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:43 np0005474864 python3.9[153932]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:43 np0005474864 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Oct  7 15:58:44 np0005474864 systemd[1]: setroubleshootd.service: Deactivated successfully.
Oct  7 15:58:44 np0005474864 python3.9[154011]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.hfhhs087 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:45 np0005474864 podman[154104]: 2025-10-07 19:58:45.108457191 +0000 UTC m=+0.128634146 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  7 15:58:45 np0005474864 python3.9[154189]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:45 np0005474864 python3.9[154267]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:46 np0005474864 python3.9[154419]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:58:47 np0005474864 python3[154572]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  7 15:58:48 np0005474864 python3.9[154724]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:49 np0005474864 podman[154774]: 2025-10-07 19:58:49.16905714 +0000 UTC m=+0.055876458 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  7 15:58:49 np0005474864 python3.9[154821]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:50 np0005474864 python3.9[154974]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:50 np0005474864 python3.9[155052]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:51 np0005474864 python3.9[155204]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:52 np0005474864 python3.9[155282]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:54 np0005474864 python3.9[155434]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:54 np0005474864 python3.9[155512]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:55 np0005474864 python3.9[155664]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:58:56 np0005474864 python3.9[155789]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759867134.9192014-3699-149743805036766/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:57 np0005474864 python3.9[155941]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:58:57 np0005474864 python3.9[156093]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:58:58 np0005474864 python3.9[156248]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:59:00 np0005474864 python3.9[156400]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:59:00 np0005474864 python3.9[156553]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:59:01 np0005474864 python3.9[156707]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 15:59:02 np0005474864 python3.9[156862]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:59:03 np0005474864 python3.9[157014]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:59:04 np0005474864 python3.9[157137]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867143.060763-3913-208243232491732/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:59:05 np0005474864 python3.9[157289]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:59:05 np0005474864 python3.9[157412]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867144.5431507-3958-193691411308740/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:59:06 np0005474864 python3.9[157564]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:59:07 np0005474864 python3.9[157687]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867146.006217-4004-214553749960483/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:59:08 np0005474864 python3.9[157839]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:59:08 np0005474864 systemd[1]: Reloading.
Oct  7 15:59:08 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:59:08 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:59:08 np0005474864 systemd[1]: Reached target edpm_libvirt.target.
Oct  7 15:59:09 np0005474864 python3.9[158031]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Oct  7 15:59:09 np0005474864 systemd[1]: Reloading.
Oct  7 15:59:09 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:59:09 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:59:10 np0005474864 systemd[1]: Reloading.
Oct  7 15:59:10 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:59:10 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:59:10 np0005474864 systemd[1]: session-25.scope: Deactivated successfully.
Oct  7 15:59:10 np0005474864 systemd[1]: session-25.scope: Consumed 3min 40.498s CPU time.
Oct  7 15:59:10 np0005474864 systemd-logind[805]: Session 25 logged out. Waiting for processes to exit.
Oct  7 15:59:10 np0005474864 systemd-logind[805]: Removed session 25.
Oct  7 15:59:15 np0005474864 podman[158128]: 2025-10-07 19:59:15.430763766 +0000 UTC m=+0.111897657 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_controller, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  7 15:59:16 np0005474864 systemd-logind[805]: New session 26 of user zuul.
Oct  7 15:59:16 np0005474864 systemd[1]: Started Session 26 of User zuul.
Oct  7 15:59:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:59:16.169 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 15:59:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:59:16.172 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 15:59:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 19:59:16.172 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 15:59:17 np0005474864 python3.9[158307]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 15:59:18 np0005474864 python3.9[158463]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:59:19 np0005474864 podman[158587]: 2025-10-07 19:59:19.381357857 +0000 UTC m=+0.081386583 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  7 15:59:19 np0005474864 python3.9[158630]: ansible-ansible.builtin.file Invoked with path=/etc/target setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:59:20 np0005474864 python3.9[158786]: ansible-ansible.builtin.file Invoked with path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:59:21 np0005474864 python3.9[158938]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  7 15:59:21 np0005474864 python3.9[159090]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/config-data/ansible-generated/iscsid setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:59:22 np0005474864 python3.9[159242]: ansible-ansible.builtin.stat Invoked with path=/lib/systemd/system/iscsid.socket follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:59:24 np0005474864 python3.9[159396]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsid.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:59:25 np0005474864 systemd[1]: Reloading.
Oct  7 15:59:25 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:59:25 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:59:26 np0005474864 python3.9[159585]: ansible-ansible.builtin.service_facts Invoked
Oct  7 15:59:26 np0005474864 network[159602]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  7 15:59:26 np0005474864 network[159603]: 'network-scripts' will be removed from distribution in near future.
Oct  7 15:59:26 np0005474864 network[159604]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  7 15:59:34 np0005474864 python3.9[159877]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iscsi-starter.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:59:34 np0005474864 systemd[1]: Reloading.
Oct  7 15:59:35 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:59:35 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:59:36 np0005474864 python3.9[160064]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:59:37 np0005474864 python3.9[160216]: ansible-containers.podman.podman_container Invoked with command=/usr/sbin/iscsi-iname detach=False image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified name=iscsid_config rm=True tty=True executable=podman state=started debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  7 15:59:37 np0005474864 podman[160252]: 2025-10-07 19:59:37.573323861 +0000 UTC m=+0.060150841 container create b1f0a3e80164c45ef8e03418681700435bda25e39392713975eacf5e9405b586 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  7 15:59:37 np0005474864 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  7 15:59:37 np0005474864 NetworkManager[51631]: <info>  [1759867177.6038] manager: (podman0): new Bridge device (/org/freedesktop/NetworkManager/Devices/21)
Oct  7 15:59:37 np0005474864 kernel: podman0: port 1(veth0) entered blocking state
Oct  7 15:59:37 np0005474864 kernel: podman0: port 1(veth0) entered disabled state
Oct  7 15:59:37 np0005474864 kernel: veth0: entered allmulticast mode
Oct  7 15:59:37 np0005474864 kernel: veth0: entered promiscuous mode
Oct  7 15:59:37 np0005474864 NetworkManager[51631]: <info>  [1759867177.6178] manager: (veth0): new Veth device (/org/freedesktop/NetworkManager/Devices/22)
Oct  7 15:59:37 np0005474864 kernel: podman0: port 1(veth0) entered blocking state
Oct  7 15:59:37 np0005474864 kernel: podman0: port 1(veth0) entered forwarding state
Oct  7 15:59:37 np0005474864 NetworkManager[51631]: <info>  [1759867177.6203] device (veth0): carrier: link connected
Oct  7 15:59:37 np0005474864 NetworkManager[51631]: <info>  [1759867177.6207] device (podman0): carrier: link connected
Oct  7 15:59:37 np0005474864 systemd-udevd[160286]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 15:59:37 np0005474864 systemd-udevd[160283]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 15:59:37 np0005474864 podman[160252]: 2025-10-07 19:59:37.548221091 +0000 UTC m=+0.035048091 image pull 74877095db294c27659f24e7f86074178a6f28eee68561c30e3ce4d18519e09c quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  7 15:59:37 np0005474864 NetworkManager[51631]: <info>  [1759867177.6546] device (podman0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 15:59:37 np0005474864 NetworkManager[51631]: <info>  [1759867177.6552] device (podman0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Oct  7 15:59:37 np0005474864 NetworkManager[51631]: <info>  [1759867177.6559] device (podman0): Activation: starting connection 'podman0' (14bdd2e6-6e05-4786-b168-a9fbfcab58ab)
Oct  7 15:59:37 np0005474864 NetworkManager[51631]: <info>  [1759867177.6561] device (podman0): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Oct  7 15:59:37 np0005474864 NetworkManager[51631]: <info>  [1759867177.6567] device (podman0): state change: prepare -> config (reason 'none', managed-type: 'external')
Oct  7 15:59:37 np0005474864 NetworkManager[51631]: <info>  [1759867177.6568] device (podman0): state change: config -> ip-config (reason 'none', managed-type: 'external')
Oct  7 15:59:37 np0005474864 NetworkManager[51631]: <info>  [1759867177.6571] device (podman0): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Oct  7 15:59:37 np0005474864 systemd[1]: Starting Network Manager Script Dispatcher Service...
Oct  7 15:59:37 np0005474864 systemd[1]: Started Network Manager Script Dispatcher Service.
Oct  7 15:59:37 np0005474864 NetworkManager[51631]: <info>  [1759867177.6855] device (podman0): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Oct  7 15:59:37 np0005474864 NetworkManager[51631]: <info>  [1759867177.6860] device (podman0): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Oct  7 15:59:37 np0005474864 NetworkManager[51631]: <info>  [1759867177.6868] device (podman0): Activation: successful, device activated.
Oct  7 15:59:37 np0005474864 systemd[1]: iscsi.service: Unit cannot be reloaded because it is inactive.
Oct  7 15:59:37 np0005474864 systemd[1]: Started libpod-conmon-b1f0a3e80164c45ef8e03418681700435bda25e39392713975eacf5e9405b586.scope.
Oct  7 15:59:37 np0005474864 systemd[1]: Started libcrun container.
Oct  7 15:59:37 np0005474864 podman[160252]: 2025-10-07 19:59:37.978316043 +0000 UTC m=+0.465143043 container init b1f0a3e80164c45ef8e03418681700435bda25e39392713975eacf5e9405b586 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2)
Oct  7 15:59:37 np0005474864 podman[160252]: 2025-10-07 19:59:37.989293176 +0000 UTC m=+0.476120176 container start b1f0a3e80164c45ef8e03418681700435bda25e39392713975eacf5e9405b586 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Oct  7 15:59:37 np0005474864 iscsid_config[160411]: iqn.1994-05.com.redhat:dd2bcee51a5d#015
Oct  7 15:59:37 np0005474864 systemd[1]: libpod-b1f0a3e80164c45ef8e03418681700435bda25e39392713975eacf5e9405b586.scope: Deactivated successfully.
Oct  7 15:59:37 np0005474864 podman[160252]: 2025-10-07 19:59:37.993110011 +0000 UTC m=+0.479936971 container attach b1f0a3e80164c45ef8e03418681700435bda25e39392713975eacf5e9405b586 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 15:59:37 np0005474864 podman[160252]: 2025-10-07 19:59:37.99802541 +0000 UTC m=+0.484852390 container died b1f0a3e80164c45ef8e03418681700435bda25e39392713975eacf5e9405b586 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  7 15:59:38 np0005474864 kernel: podman0: port 1(veth0) entered disabled state
Oct  7 15:59:38 np0005474864 kernel: veth0 (unregistering): left allmulticast mode
Oct  7 15:59:38 np0005474864 kernel: veth0 (unregistering): left promiscuous mode
Oct  7 15:59:38 np0005474864 kernel: podman0: port 1(veth0) entered disabled state
Oct  7 15:59:38 np0005474864 NetworkManager[51631]: <info>  [1759867178.0595] device (podman0): state change: activated -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 15:59:38 np0005474864 systemd[1]: run-netns-netns\x2dafc70d81\x2d0de3\x2d1c41\x2d907d\x2d07cfbf0f5f9a.mount: Deactivated successfully.
Oct  7 15:59:38 np0005474864 systemd[1]: var-lib-containers-storage-overlay-38baf33e9484ab946d297e5305595e4edef0258c97f21583a14846599614ec7b-merged.mount: Deactivated successfully.
Oct  7 15:59:38 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b1f0a3e80164c45ef8e03418681700435bda25e39392713975eacf5e9405b586-userdata-shm.mount: Deactivated successfully.
Oct  7 15:59:38 np0005474864 podman[160252]: 2025-10-07 19:59:38.436744403 +0000 UTC m=+0.923571413 container remove b1f0a3e80164c45ef8e03418681700435bda25e39392713975eacf5e9405b586 (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid_config, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 15:59:38 np0005474864 python3.9[160216]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman run --name iscsid_config --detach=False --rm --tty=True quay.io/podified-antelope-centos9/openstack-iscsid:current-podified /usr/sbin/iscsi-iname
Oct  7 15:59:38 np0005474864 systemd[1]: libpod-conmon-b1f0a3e80164c45ef8e03418681700435bda25e39392713975eacf5e9405b586.scope: Deactivated successfully.
Oct  7 15:59:38 np0005474864 python3.9[160216]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: Error generating systemd: #012DEPRECATED command:#012It is recommended to use Quadlets for running containers and pods under systemd.#012#012Please refer to podman-systemd.unit(5) for details.#012Error: iscsid_config does not refer to a container or pod: no pod with name or ID iscsid_config found: no such pod: no container with name or ID "iscsid_config" found: no such container
Oct  7 15:59:39 np0005474864 python3.9[160654]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:59:40 np0005474864 python3.9[160777]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867178.9733663-319-156931415112208/.source.iscsi _original_basename=.njpzquvw follow=False checksum=bc82cd4f0a4358cf0df7c81890a9e6a6f4f3ffdc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:59:41 np0005474864 python3.9[160929]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:59:42 np0005474864 python3.9[161079]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/iscsid.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 15:59:43 np0005474864 python3.9[161233]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:59:44 np0005474864 python3.9[161385]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:59:44 np0005474864 python3.9[161537]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:59:45 np0005474864 python3.9[161615]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:59:46 np0005474864 podman[161739]: 2025-10-07 19:59:46.166851721 +0000 UTC m=+0.161003842 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  7 15:59:46 np0005474864 python3.9[161782]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:59:46 np0005474864 python3.9[161870]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:59:47 np0005474864 python3.9[162022]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:59:48 np0005474864 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Oct  7 15:59:48 np0005474864 python3.9[162174]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:59:48 np0005474864 python3.9[162252]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:59:49 np0005474864 python3.9[162404]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:59:49 np0005474864 podman[162454]: 2025-10-07 19:59:49.890440544 +0000 UTC m=+0.075902737 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  7 15:59:50 np0005474864 python3.9[162501]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:59:51 np0005474864 python3.9[162653]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:59:51 np0005474864 systemd[1]: Reloading.
Oct  7 15:59:51 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:59:51 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:59:53 np0005474864 python3.9[162842]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:59:53 np0005474864 python3.9[162920]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:59:54 np0005474864 python3.9[163072]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:59:54 np0005474864 python3.9[163150]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 15:59:55 np0005474864 python3.9[163302]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 15:59:55 np0005474864 systemd[1]: Reloading.
Oct  7 15:59:55 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 15:59:55 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 15:59:56 np0005474864 systemd[1]: Starting Create netns directory...
Oct  7 15:59:56 np0005474864 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  7 15:59:56 np0005474864 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  7 15:59:56 np0005474864 systemd[1]: Finished Create netns directory.
Oct  7 15:59:57 np0005474864 python3.9[163495]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 15:59:58 np0005474864 python3.9[163647]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/iscsid/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 15:59:59 np0005474864 python3.9[163770]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/iscsid/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759867197.916689-782-21777499240116/.source _original_basename=healthcheck follow=False checksum=2e1237e7fe015c809b173c52e24cfb87132f4344 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:00:00 np0005474864 python3.9[163922]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:00:01 np0005474864 python3.9[164074]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/iscsid.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:00:01 np0005474864 python3.9[164197]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/iscsid.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867200.6200125-856-89690021253460/.source.json _original_basename=.hnr8blf4 follow=False checksum=80e4f97460718c7e5c66b21ef8b846eba0e0dbc8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:02 np0005474864 python3.9[164349]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/iscsid state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:05 np0005474864 python3.9[164776]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/iscsid config_pattern=*.json debug=False
Oct  7 16:00:06 np0005474864 python3.9[164928]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  7 16:00:07 np0005474864 python3.9[165080]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  7 16:00:09 np0005474864 python3[165255]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/iscsid config_id=iscsid config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  7 16:00:09 np0005474864 podman[165290]: 2025-10-07 20:00:09.389332706 +0000 UTC m=+0.060277531 container create 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, config_id=iscsid, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  7 16:00:09 np0005474864 podman[165290]: 2025-10-07 20:00:09.36298498 +0000 UTC m=+0.033929895 image pull 74877095db294c27659f24e7f86074178a6f28eee68561c30e3ce4d18519e09c quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  7 16:00:09 np0005474864 python3[165255]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name iscsid --conmon-pidfile /run/iscsid.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=iscsid --label container_name=iscsid --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run:/run --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:z --volume /etc/target:/etc/target:z --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /var/lib/openstack/healthchecks/iscsid:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-iscsid:current-podified
Oct  7 16:00:10 np0005474864 python3.9[165480]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:00:11 np0005474864 python3.9[165634]: ansible-file Invoked with path=/etc/systemd/system/edpm_iscsid.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:12 np0005474864 python3.9[165710]: ansible-stat Invoked with path=/etc/systemd/system/edpm_iscsid_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:00:12 np0005474864 python3.9[165861]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759867212.1574311-1120-156640504311548/source dest=/etc/systemd/system/edpm_iscsid.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:13 np0005474864 python3.9[165937]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  7 16:00:13 np0005474864 systemd[1]: Reloading.
Oct  7 16:00:13 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:00:13 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:00:14 np0005474864 python3.9[166047]: ansible-systemd Invoked with state=restarted name=edpm_iscsid.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 16:00:14 np0005474864 systemd[1]: Reloading.
Oct  7 16:00:14 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:00:14 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:00:14 np0005474864 systemd[1]: Starting iscsid container...
Oct  7 16:00:15 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:00:15 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d316c256046cddc46eeed0860d66f530f3c0d5a66ec26a33af8aeba4d7a18cf1/merged/etc/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  7 16:00:15 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d316c256046cddc46eeed0860d66f530f3c0d5a66ec26a33af8aeba4d7a18cf1/merged/etc/target supports timestamps until 2038 (0x7fffffff)
Oct  7 16:00:15 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d316c256046cddc46eeed0860d66f530f3c0d5a66ec26a33af8aeba4d7a18cf1/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  7 16:00:15 np0005474864 systemd[1]: Started /usr/bin/podman healthcheck run 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b.
Oct  7 16:00:15 np0005474864 podman[166086]: 2025-10-07 20:00:15.098629884 +0000 UTC m=+0.134832670 container init 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid)
Oct  7 16:00:15 np0005474864 iscsid[166102]: + sudo -E kolla_set_configs
Oct  7 16:00:15 np0005474864 podman[166086]: 2025-10-07 20:00:15.128378908 +0000 UTC m=+0.164581744 container start 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  7 16:00:15 np0005474864 podman[166086]: iscsid
Oct  7 16:00:15 np0005474864 systemd[1]: Started iscsid container.
Oct  7 16:00:15 np0005474864 systemd[1]: Created slice User Slice of UID 0.
Oct  7 16:00:15 np0005474864 systemd[1]: Starting User Runtime Directory /run/user/0...
Oct  7 16:00:15 np0005474864 systemd[1]: Finished User Runtime Directory /run/user/0.
Oct  7 16:00:15 np0005474864 systemd[1]: Starting User Manager for UID 0...
Oct  7 16:00:15 np0005474864 podman[166109]: 2025-10-07 20:00:15.235690347 +0000 UTC m=+0.095364538 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=iscsid, org.label-schema.build-date=20251001)
Oct  7 16:00:15 np0005474864 systemd[1]: 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b-5f60da3fb43f9008.service: Main process exited, code=exited, status=1/FAILURE
Oct  7 16:00:15 np0005474864 systemd[1]: 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b-5f60da3fb43f9008.service: Failed with result 'exit-code'.
Oct  7 16:00:15 np0005474864 systemd[166128]: Queued start job for default target Main User Target.
Oct  7 16:00:15 np0005474864 systemd[166128]: Created slice User Application Slice.
Oct  7 16:00:15 np0005474864 systemd[166128]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Oct  7 16:00:15 np0005474864 systemd[166128]: Started Daily Cleanup of User's Temporary Directories.
Oct  7 16:00:15 np0005474864 systemd[166128]: Reached target Paths.
Oct  7 16:00:15 np0005474864 systemd[166128]: Reached target Timers.
Oct  7 16:00:15 np0005474864 systemd[166128]: Starting D-Bus User Message Bus Socket...
Oct  7 16:00:15 np0005474864 systemd[166128]: Starting Create User's Volatile Files and Directories...
Oct  7 16:00:15 np0005474864 systemd[166128]: Listening on D-Bus User Message Bus Socket.
Oct  7 16:00:15 np0005474864 systemd[166128]: Reached target Sockets.
Oct  7 16:00:15 np0005474864 systemd[166128]: Finished Create User's Volatile Files and Directories.
Oct  7 16:00:15 np0005474864 systemd[166128]: Reached target Basic System.
Oct  7 16:00:15 np0005474864 systemd[166128]: Reached target Main User Target.
Oct  7 16:00:15 np0005474864 systemd[166128]: Startup finished in 175ms.
Oct  7 16:00:15 np0005474864 systemd[1]: Started User Manager for UID 0.
Oct  7 16:00:15 np0005474864 systemd[1]: Started Session c3 of User root.
Oct  7 16:00:15 np0005474864 iscsid[166102]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  7 16:00:15 np0005474864 iscsid[166102]: INFO:__main__:Validating config file
Oct  7 16:00:15 np0005474864 iscsid[166102]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  7 16:00:15 np0005474864 iscsid[166102]: INFO:__main__:Writing out command to execute
Oct  7 16:00:15 np0005474864 systemd[1]: session-c3.scope: Deactivated successfully.
Oct  7 16:00:15 np0005474864 iscsid[166102]: ++ cat /run_command
Oct  7 16:00:15 np0005474864 iscsid[166102]: + CMD='/usr/sbin/iscsid -f'
Oct  7 16:00:15 np0005474864 iscsid[166102]: + ARGS=
Oct  7 16:00:15 np0005474864 iscsid[166102]: + sudo kolla_copy_cacerts
Oct  7 16:00:15 np0005474864 systemd[1]: Started Session c4 of User root.
Oct  7 16:00:15 np0005474864 iscsid[166102]: Running command: '/usr/sbin/iscsid -f'
Oct  7 16:00:15 np0005474864 iscsid[166102]: + [[ ! -n '' ]]
Oct  7 16:00:15 np0005474864 iscsid[166102]: + . kolla_extend_start
Oct  7 16:00:15 np0005474864 iscsid[166102]: ++ [[ ! -f /etc/iscsi/initiatorname.iscsi ]]
Oct  7 16:00:15 np0005474864 iscsid[166102]: + echo 'Running command: '\''/usr/sbin/iscsid -f'\'''
Oct  7 16:00:15 np0005474864 iscsid[166102]: + umask 0022
Oct  7 16:00:15 np0005474864 iscsid[166102]: + exec /usr/sbin/iscsid -f
Oct  7 16:00:15 np0005474864 systemd[1]: session-c4.scope: Deactivated successfully.
Oct  7 16:00:15 np0005474864 kernel: Loading iSCSI transport class v2.0-870.
Oct  7 16:00:16 np0005474864 python3.9[166309]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.iscsid_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:00:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:00:16.170 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:00:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:00:16.171 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:00:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:00:16.172 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:00:16 np0005474864 podman[166334]: 2025-10-07 20:00:16.461483781 +0000 UTC m=+0.146381481 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  7 16:00:17 np0005474864 python3.9[166487]: ansible-ansible.builtin.file Invoked with path=/etc/iscsi/.iscsid_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:18 np0005474864 python3.9[166639]: ansible-ansible.builtin.service_facts Invoked
Oct  7 16:00:18 np0005474864 network[166656]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  7 16:00:18 np0005474864 network[166657]: 'network-scripts' will be removed from distribution in near future.
Oct  7 16:00:18 np0005474864 network[166658]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  7 16:00:20 np0005474864 podman[166694]: 2025-10-07 20:00:20.049510998 +0000 UTC m=+0.091522207 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:00:23 np0005474864 python3.9[166951]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  7 16:00:24 np0005474864 python3.9[167103]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Oct  7 16:00:25 np0005474864 python3.9[167259]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:00:25 np0005474864 systemd[1]: Stopping User Manager for UID 0...
Oct  7 16:00:25 np0005474864 systemd[166128]: Activating special unit Exit the Session...
Oct  7 16:00:25 np0005474864 systemd[166128]: Stopped target Main User Target.
Oct  7 16:00:25 np0005474864 systemd[166128]: Stopped target Basic System.
Oct  7 16:00:25 np0005474864 systemd[166128]: Stopped target Paths.
Oct  7 16:00:25 np0005474864 systemd[166128]: Stopped target Sockets.
Oct  7 16:00:25 np0005474864 systemd[166128]: Stopped target Timers.
Oct  7 16:00:25 np0005474864 systemd[166128]: Stopped Daily Cleanup of User's Temporary Directories.
Oct  7 16:00:25 np0005474864 systemd[166128]: Closed D-Bus User Message Bus Socket.
Oct  7 16:00:25 np0005474864 systemd[166128]: Stopped Create User's Volatile Files and Directories.
Oct  7 16:00:25 np0005474864 systemd[166128]: Removed slice User Application Slice.
Oct  7 16:00:25 np0005474864 systemd[166128]: Reached target Shutdown.
Oct  7 16:00:25 np0005474864 systemd[166128]: Finished Exit the Session.
Oct  7 16:00:25 np0005474864 systemd[166128]: Reached target Exit the Session.
Oct  7 16:00:25 np0005474864 systemd[1]: user@0.service: Deactivated successfully.
Oct  7 16:00:25 np0005474864 systemd[1]: Stopped User Manager for UID 0.
Oct  7 16:00:25 np0005474864 systemd[1]: Stopping User Runtime Directory /run/user/0...
Oct  7 16:00:25 np0005474864 systemd[1]: run-user-0.mount: Deactivated successfully.
Oct  7 16:00:25 np0005474864 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Oct  7 16:00:25 np0005474864 systemd[1]: Stopped User Runtime Directory /run/user/0.
Oct  7 16:00:25 np0005474864 systemd[1]: Removed slice User Slice of UID 0.
Oct  7 16:00:26 np0005474864 python3.9[167384]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867225.0156658-1342-75954289981332/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:27 np0005474864 python3.9[167536]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:28 np0005474864 python3.9[167688]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 16:00:28 np0005474864 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct  7 16:00:28 np0005474864 systemd[1]: Stopped Load Kernel Modules.
Oct  7 16:00:28 np0005474864 systemd[1]: Stopping Load Kernel Modules...
Oct  7 16:00:28 np0005474864 systemd[1]: Starting Load Kernel Modules...
Oct  7 16:00:28 np0005474864 systemd[1]: Finished Load Kernel Modules.
Oct  7 16:00:29 np0005474864 python3.9[167844]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:00:30 np0005474864 python3.9[167996]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:00:30 np0005474864 python3.9[168148]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:00:31 np0005474864 python3.9[168300]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:00:32 np0005474864 systemd[1]: virtnodedevd.service: Deactivated successfully.
Oct  7 16:00:32 np0005474864 python3.9[168424]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867231.2408807-1517-80289541679990/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:33 np0005474864 systemd[1]: virtproxyd.service: Deactivated successfully.
Oct  7 16:00:33 np0005474864 python3.9[168576]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 16:00:34 np0005474864 python3.9[168730]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:35 np0005474864 python3.9[168882]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:35 np0005474864 systemd[1]: virtqemud.service: Deactivated successfully.
Oct  7 16:00:36 np0005474864 python3.9[169035]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:37 np0005474864 systemd[1]: virtsecretd.service: Deactivated successfully.
Oct  7 16:00:37 np0005474864 python3.9[169188]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:37 np0005474864 python3.9[169340]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:38 np0005474864 python3.9[169492]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:39 np0005474864 python3.9[169644]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:40 np0005474864 python3.9[169796]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:00:41 np0005474864 python3.9[169950]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:42 np0005474864 python3.9[170102]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:00:43 np0005474864 python3.9[170254]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:00:43 np0005474864 python3.9[170332]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:00:44 np0005474864 python3.9[170484]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:00:44 np0005474864 python3.9[170562]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:00:45 np0005474864 podman[170610]: 2025-10-07 20:00:45.413812767 +0000 UTC m=+0.101906015 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  7 16:00:45 np0005474864 python3.9[170734]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:46 np0005474864 podman[170886]: 2025-10-07 20:00:46.66987453 +0000 UTC m=+0.118477961 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller)
Oct  7 16:00:46 np0005474864 python3.9[170887]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:00:47 np0005474864 python3.9[170989]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:48 np0005474864 python3.9[171141]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:00:48 np0005474864 python3.9[171219]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:49 np0005474864 python3.9[171371]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 16:00:49 np0005474864 systemd[1]: Reloading.
Oct  7 16:00:49 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:00:49 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:00:50 np0005474864 podman[171450]: 2025-10-07 20:00:50.440449897 +0000 UTC m=+0.121239670 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:00:50 np0005474864 python3.9[171580]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:00:51 np0005474864 python3.9[171658]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:52 np0005474864 python3.9[171810]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:00:52 np0005474864 python3.9[171888]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:00:53 np0005474864 python3.9[172040]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 16:00:53 np0005474864 systemd[1]: Reloading.
Oct  7 16:00:54 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:00:54 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:00:54 np0005474864 systemd[1]: Starting Create netns directory...
Oct  7 16:00:54 np0005474864 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Oct  7 16:00:54 np0005474864 systemd[1]: netns-placeholder.service: Deactivated successfully.
Oct  7 16:00:54 np0005474864 systemd[1]: Finished Create netns directory.
Oct  7 16:00:55 np0005474864 python3.9[172234]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:00:56 np0005474864 python3.9[172386]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:00:56 np0005474864 python3.9[172509]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759867255.594986-2137-254106524716179/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:00:57 np0005474864 python3.9[172661]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:00:58 np0005474864 python3.9[172813]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:00:59 np0005474864 python3.9[172936]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867258.185847-2212-51508374578249/.source.json _original_basename=.305qtj4w follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:01:00 np0005474864 python3.9[173088]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:01:02 np0005474864 python3.9[173530]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Oct  7 16:01:03 np0005474864 python3.9[173682]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  7 16:01:04 np0005474864 python3.9[173834]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Oct  7 16:01:06 np0005474864 python3[174012]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Oct  7 16:01:06 np0005474864 podman[174049]: 2025-10-07 20:01:06.565464723 +0000 UTC m=+0.067666259 container create ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  7 16:01:06 np0005474864 podman[174049]: 2025-10-07 20:01:06.525915793 +0000 UTC m=+0.028117389 image pull f541ff382622bd8bc9ad206129d2a8e74c239ff4503fa3b67d3bdf6d5b50b511 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct  7 16:01:06 np0005474864 python3[174012]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Oct  7 16:01:07 np0005474864 python3.9[174239]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:01:08 np0005474864 python3.9[174393]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:01:09 np0005474864 python3.9[174469]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:01:10 np0005474864 python3.9[174620]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759867269.2675676-2476-102016880909899/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:01:10 np0005474864 python3.9[174696]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  7 16:01:10 np0005474864 systemd[1]: Reloading.
Oct  7 16:01:10 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:01:10 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:01:11 np0005474864 python3.9[174806]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 16:01:11 np0005474864 systemd[1]: Reloading.
Oct  7 16:01:11 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:01:11 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:01:12 np0005474864 systemd[1]: Starting multipathd container...
Oct  7 16:01:12 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:01:12 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d70b36950c93e15e99fb783ef5b044c9d1794cc032fb92fd6a7d156991b6fba/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  7 16:01:12 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d70b36950c93e15e99fb783ef5b044c9d1794cc032fb92fd6a7d156991b6fba/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  7 16:01:12 np0005474864 systemd[1]: Started /usr/bin/podman healthcheck run ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434.
Oct  7 16:01:12 np0005474864 podman[174846]: 2025-10-07 20:01:12.247403645 +0000 UTC m=+0.125089267 container init ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Oct  7 16:01:12 np0005474864 multipathd[174860]: + sudo -E kolla_set_configs
Oct  7 16:01:12 np0005474864 podman[174846]: 2025-10-07 20:01:12.268560511 +0000 UTC m=+0.146246113 container start ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:01:12 np0005474864 podman[174846]: multipathd
Oct  7 16:01:12 np0005474864 systemd[1]: Started multipathd container.
Oct  7 16:01:12 np0005474864 multipathd[174860]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  7 16:01:12 np0005474864 multipathd[174860]: INFO:__main__:Validating config file
Oct  7 16:01:12 np0005474864 multipathd[174860]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  7 16:01:12 np0005474864 multipathd[174860]: INFO:__main__:Writing out command to execute
Oct  7 16:01:12 np0005474864 multipathd[174860]: ++ cat /run_command
Oct  7 16:01:12 np0005474864 multipathd[174860]: + CMD='/usr/sbin/multipathd -d'
Oct  7 16:01:12 np0005474864 multipathd[174860]: + ARGS=
Oct  7 16:01:12 np0005474864 multipathd[174860]: + sudo kolla_copy_cacerts
Oct  7 16:01:12 np0005474864 multipathd[174860]: + [[ ! -n '' ]]
Oct  7 16:01:12 np0005474864 multipathd[174860]: + . kolla_extend_start
Oct  7 16:01:12 np0005474864 multipathd[174860]: Running command: '/usr/sbin/multipathd -d'
Oct  7 16:01:12 np0005474864 multipathd[174860]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct  7 16:01:12 np0005474864 multipathd[174860]: + umask 0022
Oct  7 16:01:12 np0005474864 multipathd[174860]: + exec /usr/sbin/multipathd -d
Oct  7 16:01:12 np0005474864 podman[174867]: 2025-10-07 20:01:12.366795217 +0000 UTC m=+0.075791195 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Oct  7 16:01:12 np0005474864 systemd[1]: ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434-1648a9cbcf7f6ad.service: Main process exited, code=exited, status=1/FAILURE
Oct  7 16:01:12 np0005474864 systemd[1]: ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434-1648a9cbcf7f6ad.service: Failed with result 'exit-code'.
Oct  7 16:01:12 np0005474864 multipathd[174860]: 2908.324734 | --------start up--------
Oct  7 16:01:12 np0005474864 multipathd[174860]: 2908.324769 | read /etc/multipath.conf
Oct  7 16:01:12 np0005474864 multipathd[174860]: 2908.332608 | path checkers start up
Oct  7 16:01:13 np0005474864 python3.9[175049]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:01:14 np0005474864 python3.9[175203]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 16:01:15 np0005474864 python3.9[175368]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 16:01:15 np0005474864 systemd[1]: Stopping multipathd container...
Oct  7 16:01:15 np0005474864 multipathd[174860]: 2911.416224 | exit (signal)
Oct  7 16:01:15 np0005474864 multipathd[174860]: 2911.416291 | --------shut down-------
Oct  7 16:01:15 np0005474864 systemd[1]: libpod-ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434.scope: Deactivated successfully.
Oct  7 16:01:15 np0005474864 podman[175372]: 2025-10-07 20:01:15.498336172 +0000 UTC m=+0.086264839 container died ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  7 16:01:15 np0005474864 systemd[1]: ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434-1648a9cbcf7f6ad.timer: Deactivated successfully.
Oct  7 16:01:15 np0005474864 systemd[1]: Stopped /usr/bin/podman healthcheck run ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434.
Oct  7 16:01:15 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434-userdata-shm.mount: Deactivated successfully.
Oct  7 16:01:15 np0005474864 systemd[1]: var-lib-containers-storage-overlay-1d70b36950c93e15e99fb783ef5b044c9d1794cc032fb92fd6a7d156991b6fba-merged.mount: Deactivated successfully.
Oct  7 16:01:15 np0005474864 podman[175372]: 2025-10-07 20:01:15.542018282 +0000 UTC m=+0.129946919 container cleanup ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:01:15 np0005474864 podman[175372]: multipathd
Oct  7 16:01:15 np0005474864 podman[175388]: 2025-10-07 20:01:15.604439787 +0000 UTC m=+0.073991412 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  7 16:01:15 np0005474864 podman[175408]: multipathd
Oct  7 16:01:15 np0005474864 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Oct  7 16:01:15 np0005474864 systemd[1]: Stopped multipathd container.
Oct  7 16:01:15 np0005474864 systemd[1]: Starting multipathd container...
Oct  7 16:01:15 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:01:15 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d70b36950c93e15e99fb783ef5b044c9d1794cc032fb92fd6a7d156991b6fba/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  7 16:01:15 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1d70b36950c93e15e99fb783ef5b044c9d1794cc032fb92fd6a7d156991b6fba/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  7 16:01:15 np0005474864 systemd[1]: Started /usr/bin/podman healthcheck run ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434.
Oct  7 16:01:15 np0005474864 podman[175430]: 2025-10-07 20:01:15.804354318 +0000 UTC m=+0.158634412 container init ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:01:15 np0005474864 multipathd[175445]: + sudo -E kolla_set_configs
Oct  7 16:01:15 np0005474864 podman[175430]: 2025-10-07 20:01:15.846422911 +0000 UTC m=+0.200702975 container start ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  7 16:01:15 np0005474864 podman[175430]: multipathd
Oct  7 16:01:15 np0005474864 systemd[1]: Started multipathd container.
Oct  7 16:01:15 np0005474864 multipathd[175445]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  7 16:01:15 np0005474864 multipathd[175445]: INFO:__main__:Validating config file
Oct  7 16:01:15 np0005474864 multipathd[175445]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  7 16:01:15 np0005474864 multipathd[175445]: INFO:__main__:Writing out command to execute
Oct  7 16:01:15 np0005474864 multipathd[175445]: ++ cat /run_command
Oct  7 16:01:15 np0005474864 multipathd[175445]: + CMD='/usr/sbin/multipathd -d'
Oct  7 16:01:15 np0005474864 multipathd[175445]: + ARGS=
Oct  7 16:01:15 np0005474864 multipathd[175445]: + sudo kolla_copy_cacerts
Oct  7 16:01:15 np0005474864 podman[175453]: 2025-10-07 20:01:15.952046702 +0000 UTC m=+0.095387634 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  7 16:01:15 np0005474864 multipathd[175445]: Running command: '/usr/sbin/multipathd -d'
Oct  7 16:01:15 np0005474864 multipathd[175445]: + [[ ! -n '' ]]
Oct  7 16:01:15 np0005474864 multipathd[175445]: + . kolla_extend_start
Oct  7 16:01:15 np0005474864 multipathd[175445]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Oct  7 16:01:15 np0005474864 multipathd[175445]: + umask 0022
Oct  7 16:01:15 np0005474864 multipathd[175445]: + exec /usr/sbin/multipathd -d
Oct  7 16:01:15 np0005474864 systemd[1]: ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434-68774f6b7a1842f0.service: Main process exited, code=exited, status=1/FAILURE
Oct  7 16:01:15 np0005474864 systemd[1]: ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434-68774f6b7a1842f0.service: Failed with result 'exit-code'.
Oct  7 16:01:15 np0005474864 multipathd[175445]: 2911.924656 | --------start up--------
Oct  7 16:01:15 np0005474864 multipathd[175445]: 2911.924674 | read /etc/multipath.conf
Oct  7 16:01:15 np0005474864 multipathd[175445]: 2911.931276 | path checkers start up
Oct  7 16:01:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:01:16.170 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:01:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:01:16.171 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:01:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:01:16.171 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:01:16 np0005474864 python3.9[175637]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:01:17 np0005474864 podman[175685]: 2025-10-07 20:01:17.409093584 +0000 UTC m=+0.103174231 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  7 16:01:17 np0005474864 python3.9[175815]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Oct  7 16:01:18 np0005474864 python3.9[175967]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Oct  7 16:01:18 np0005474864 kernel: Key type psk registered
Oct  7 16:01:19 np0005474864 python3.9[176128]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:01:20 np0005474864 python3.9[176251]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867278.87105-2717-138931071960716/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:01:20 np0005474864 podman[176375]: 2025-10-07 20:01:20.67483733 +0000 UTC m=+0.059208432 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  7 16:01:20 np0005474864 python3.9[176423]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:01:21 np0005474864 python3.9[176575]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 16:01:21 np0005474864 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Oct  7 16:01:21 np0005474864 systemd[1]: Stopped Load Kernel Modules.
Oct  7 16:01:21 np0005474864 systemd[1]: Stopping Load Kernel Modules...
Oct  7 16:01:22 np0005474864 systemd[1]: Starting Load Kernel Modules...
Oct  7 16:01:22 np0005474864 systemd[1]: Finished Load Kernel Modules.
Oct  7 16:01:23 np0005474864 python3.9[176731]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Oct  7 16:01:24 np0005474864 python3.9[176815]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Oct  7 16:01:30 np0005474864 systemd[1]: Reloading.
Oct  7 16:01:30 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:01:30 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:01:30 np0005474864 systemd[1]: Reloading.
Oct  7 16:01:30 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:01:30 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:01:31 np0005474864 systemd-logind[805]: Watching system buttons on /dev/input/event0 (Power Button)
Oct  7 16:01:31 np0005474864 systemd-logind[805]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Oct  7 16:01:31 np0005474864 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Oct  7 16:01:31 np0005474864 systemd[1]: Starting man-db-cache-update.service...
Oct  7 16:01:31 np0005474864 systemd[1]: Reloading.
Oct  7 16:01:31 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:01:31 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:01:31 np0005474864 systemd[1]: Queuing reload/restart jobs for marked units…
Oct  7 16:01:32 np0005474864 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Oct  7 16:01:32 np0005474864 systemd[1]: Finished man-db-cache-update.service.
Oct  7 16:01:32 np0005474864 systemd[1]: man-db-cache-update.service: Consumed 1.685s CPU time.
Oct  7 16:01:32 np0005474864 systemd[1]: run-r5a8934aee2e6414d8ac5107bbe7fe9db.service: Deactivated successfully.
Oct  7 16:01:33 np0005474864 python3.9[178268]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.iscsid_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:01:34 np0005474864 python3.9[178418]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 16:01:35 np0005474864 python3.9[178574]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:01:37 np0005474864 python3.9[178726]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  7 16:01:37 np0005474864 systemd[1]: Reloading.
Oct  7 16:01:37 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:01:37 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:01:38 np0005474864 python3.9[178911]: ansible-ansible.builtin.service_facts Invoked
Oct  7 16:01:38 np0005474864 network[178928]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  7 16:01:38 np0005474864 network[178929]: 'network-scripts' will be removed from distribution in near future.
Oct  7 16:01:38 np0005474864 network[178930]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  7 16:01:43 np0005474864 python3.9[179207]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 16:01:44 np0005474864 python3.9[179360]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 16:01:45 np0005474864 python3.9[179513]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 16:01:46 np0005474864 podman[179638]: 2025-10-07 20:01:46.017697263 +0000 UTC m=+0.092394227 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:01:46 np0005474864 podman[179683]: 2025-10-07 20:01:46.123582511 +0000 UTC m=+0.087867565 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  7 16:01:46 np0005474864 python3.9[179685]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 16:01:47 np0005474864 python3.9[179857]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 16:01:47 np0005474864 podman[179982]: 2025-10-07 20:01:47.843150645 +0000 UTC m=+0.109299148 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:01:48 np0005474864 python3.9[180030]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 16:01:49 np0005474864 python3.9[180189]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 16:01:49 np0005474864 python3.9[180342]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 16:01:51 np0005474864 podman[180368]: 2025-10-07 20:01:51.379645123 +0000 UTC m=+0.074180718 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  7 16:01:52 np0005474864 python3.9[180514]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:01:52 np0005474864 python3.9[180666]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:01:53 np0005474864 python3.9[180818]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:01:54 np0005474864 python3.9[180970]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:01:55 np0005474864 python3.9[181122]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:01:55 np0005474864 python3.9[181274]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:01:56 np0005474864 python3.9[181426]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:01:57 np0005474864 python3.9[181578]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:01:58 np0005474864 python3.9[181730]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:01:59 np0005474864 python3.9[181882]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:01:59 np0005474864 python3.9[182034]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:02:00 np0005474864 python3.9[182186]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:02:01 np0005474864 python3.9[182338]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:02:01 np0005474864 python3.9[182490]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:02:02 np0005474864 python3.9[182642]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:02:03 np0005474864 python3.9[182794]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:02:04 np0005474864 python3.9[182946]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 16:02:05 np0005474864 python3.9[183098]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  7 16:02:06 np0005474864 python3.9[183250]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  7 16:02:06 np0005474864 systemd[1]: Reloading.
Oct  7 16:02:06 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:02:06 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:02:07 np0005474864 python3.9[183437]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 16:02:08 np0005474864 python3.9[183590]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 16:02:09 np0005474864 python3.9[183743]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 16:02:10 np0005474864 python3.9[183896]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 16:02:10 np0005474864 python3.9[184049]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 16:02:11 np0005474864 python3.9[184202]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 16:02:12 np0005474864 python3.9[184355]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 16:02:13 np0005474864 python3.9[184508]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 16:02:15 np0005474864 python3.9[184661]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:02:16 np0005474864 python3.9[184813]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:02:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:02:16.171 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:02:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:02:16.172 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:02:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:02:16.172 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:02:16 np0005474864 podman[184814]: 2025-10-07 20:02:16.252847671 +0000 UTC m=+0.083027164 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:02:16 np0005474864 podman[184815]: 2025-10-07 20:02:16.284704685 +0000 UTC m=+0.108262264 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  7 16:02:16 np0005474864 python3.9[185005]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:02:17 np0005474864 python3.9[185157]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:02:18 np0005474864 podman[185158]: 2025-10-07 20:02:18.146282168 +0000 UTC m=+0.169440751 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:02:18 np0005474864 python3.9[185336]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:02:19 np0005474864 python3.9[185488]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:02:20 np0005474864 python3.9[185640]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:02:21 np0005474864 python3.9[185792]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:02:22 np0005474864 podman[185916]: 2025-10-07 20:02:22.007607778 +0000 UTC m=+0.094441091 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  7 16:02:22 np0005474864 python3.9[185961]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:02:22 np0005474864 python3.9[186115]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/iscsi setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:02:23 np0005474864 python3.9[186267]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:02:24 np0005474864 python3.9[186419]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:02:29 np0005474864 python3.9[186571]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Oct  7 16:02:30 np0005474864 python3.9[186724]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  7 16:02:31 np0005474864 python3.9[186882]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  7 16:02:32 np0005474864 systemd-logind[805]: New session 28 of user zuul.
Oct  7 16:02:32 np0005474864 systemd[1]: Started Session 28 of User zuul.
Oct  7 16:02:33 np0005474864 systemd[1]: session-28.scope: Deactivated successfully.
Oct  7 16:02:33 np0005474864 systemd-logind[805]: Session 28 logged out. Waiting for processes to exit.
Oct  7 16:02:33 np0005474864 systemd-logind[805]: Removed session 28.
Oct  7 16:02:33 np0005474864 python3.9[187068]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:02:34 np0005474864 python3.9[187189]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759867353.2808926-4334-69430243904988/.source.json follow=False _original_basename=config.json.j2 checksum=2c2474b5f24ef7c9ed37f49680082593e0d1100b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:02:35 np0005474864 python3.9[187339]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:02:35 np0005474864 python3.9[187415]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:02:36 np0005474864 python3.9[187565]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:02:37 np0005474864 python3.9[187686]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759867356.2270398-4334-124451084533279/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:02:38 np0005474864 python3.9[187836]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:02:39 np0005474864 python3.9[187957]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759867357.7666025-4334-13738503275797/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:02:39 np0005474864 python3.9[188107]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:02:40 np0005474864 python3.9[188228]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759867359.2124326-4334-88473418676849/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:02:42 np0005474864 python3.9[188380]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:02:42 np0005474864 python3.9[188532]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:02:43 np0005474864 python3.9[188684]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:02:44 np0005474864 python3.9[188836]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:02:45 np0005474864 python3.9[188959]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1759867364.2225857-4613-100252315100474/.source _original_basename=.5j9z9pp_ follow=False checksum=143ffa0dbe9a87fc05cbdf437bae1370f18d6c66 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Oct  7 16:02:46 np0005474864 podman[189109]: 2025-10-07 20:02:46.407054009 +0000 UTC m=+0.092569409 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Oct  7 16:02:46 np0005474864 podman[189113]: 2025-10-07 20:02:46.42906601 +0000 UTC m=+0.096706874 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Oct  7 16:02:46 np0005474864 python3.9[189112]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:02:47 np0005474864 python3.9[189300]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:02:48 np0005474864 python3.9[189421]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759867366.8539963-4691-19152211427667/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=f022386746472553146d29f689b545df70fa8a60 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:02:48 np0005474864 podman[189446]: 2025-10-07 20:02:48.407754123 +0000 UTC m=+0.101579059 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Oct  7 16:02:49 np0005474864 python3.9[189598]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:02:49 np0005474864 python3.9[189719]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759867368.4458048-4736-275988505429141/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:02:50 np0005474864 python3.9[189871]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Oct  7 16:02:51 np0005474864 python3.9[190023]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  7 16:02:52 np0005474864 podman[190118]: 2025-10-07 20:02:52.43013847 +0000 UTC m=+0.108644405 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  7 16:02:52 np0005474864 python3[190194]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Oct  7 16:02:53 np0005474864 podman[190232]: 2025-10-07 20:02:53.118020123 +0000 UTC m=+0.085329868 container create 902c3d15cf8da79783d1c46d2ea9a08f5e156e92a14414eb6f989af3ff5557ef (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=nova_compute_init, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Oct  7 16:02:53 np0005474864 podman[190232]: 2025-10-07 20:02:53.078734263 +0000 UTC m=+0.046044078 image pull 7ac362f4e836cf46e10a309acb4abf774df9481a1d6404c213437495cfb42f5d quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct  7 16:02:53 np0005474864 python3[190194]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Oct  7 16:02:54 np0005474864 python3.9[190422]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:02:55 np0005474864 python3.9[190576]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Oct  7 16:02:56 np0005474864 python3.9[190728]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  7 16:02:57 np0005474864 python3[190880]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct  7 16:02:57 np0005474864 podman[190917]: 2025-10-07 20:02:57.968692309 +0000 UTC m=+0.064507040 container create 56e5458d48f17d65bd69097440dbe09e54ba1addc24caa08d0339d7c735ae8da (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, config_id=edpm, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, container_name=nova_compute, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0)
Oct  7 16:02:57 np0005474864 podman[190917]: 2025-10-07 20:02:57.936457535 +0000 UTC m=+0.032272316 image pull 7ac362f4e836cf46e10a309acb4abf774df9481a1d6404c213437495cfb42f5d quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Oct  7 16:02:57 np0005474864 python3[190880]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi:z --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume 
/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Oct  7 16:02:58 np0005474864 python3.9[191108]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:02:59 np0005474864 python3.9[191262]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:03:00 np0005474864 python3.9[191413]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759867379.8383625-5011-155029524293154/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:03:01 np0005474864 python3.9[191489]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  7 16:03:01 np0005474864 systemd[1]: Reloading.
Oct  7 16:03:01 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:03:01 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:03:02 np0005474864 python3.9[191601]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 16:03:02 np0005474864 systemd[1]: Reloading.
Oct  7 16:03:02 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:03:02 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:03:02 np0005474864 systemd[1]: Starting nova_compute container...
Oct  7 16:03:02 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:03:02 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c795aed9e99a77d36679c30bccdbe12dd7a604af170dc09ecab03ded25b3ea5/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct  7 16:03:02 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c795aed9e99a77d36679c30bccdbe12dd7a604af170dc09ecab03ded25b3ea5/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  7 16:03:02 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c795aed9e99a77d36679c30bccdbe12dd7a604af170dc09ecab03ded25b3ea5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  7 16:03:02 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c795aed9e99a77d36679c30bccdbe12dd7a604af170dc09ecab03ded25b3ea5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct  7 16:03:02 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c795aed9e99a77d36679c30bccdbe12dd7a604af170dc09ecab03ded25b3ea5/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  7 16:03:03 np0005474864 podman[191640]: 2025-10-07 20:03:03.027472559 +0000 UTC m=+0.153189741 container init 56e5458d48f17d65bd69097440dbe09e54ba1addc24caa08d0339d7c735ae8da (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm)
Oct  7 16:03:03 np0005474864 podman[191640]: 2025-10-07 20:03:03.038070163 +0000 UTC m=+0.163787295 container start 56e5458d48f17d65bd69097440dbe09e54ba1addc24caa08d0339d7c735ae8da (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:03:03 np0005474864 podman[191640]: nova_compute
Oct  7 16:03:03 np0005474864 nova_compute[191655]: + sudo -E kolla_set_configs
Oct  7 16:03:03 np0005474864 systemd[1]: Started nova_compute container.
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Validating config file
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Copying service configuration files
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Deleting /etc/ceph
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Creating directory /etc/ceph
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Setting permission for /etc/ceph
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Writing out command to execute
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  7 16:03:03 np0005474864 nova_compute[191655]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  7 16:03:03 np0005474864 nova_compute[191655]: ++ cat /run_command
Oct  7 16:03:03 np0005474864 nova_compute[191655]: + CMD=nova-compute
Oct  7 16:03:03 np0005474864 nova_compute[191655]: + ARGS=
Oct  7 16:03:03 np0005474864 nova_compute[191655]: + sudo kolla_copy_cacerts
Oct  7 16:03:03 np0005474864 nova_compute[191655]: + [[ ! -n '' ]]
Oct  7 16:03:03 np0005474864 nova_compute[191655]: + . kolla_extend_start
Oct  7 16:03:03 np0005474864 nova_compute[191655]: + echo 'Running command: '\''nova-compute'\'''
Oct  7 16:03:03 np0005474864 nova_compute[191655]: Running command: 'nova-compute'
Oct  7 16:03:03 np0005474864 nova_compute[191655]: + umask 0022
Oct  7 16:03:03 np0005474864 nova_compute[191655]: + exec nova-compute
Oct  7 16:03:04 np0005474864 python3.9[191817]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.137 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.137 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.137 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.137 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.282 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.314 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.032s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:03:05 np0005474864 python3.9[191970]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.773 2 INFO nova.virt.driver [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.891 2 INFO nova.compute.provider_config [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.903 2 DEBUG oslo_concurrency.lockutils [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.903 2 DEBUG oslo_concurrency.lockutils [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.903 2 DEBUG oslo_concurrency.lockutils [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.903 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.904 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.904 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.904 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.904 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.904 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.904 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.905 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.905 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.905 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.905 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.905 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.905 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.905 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.905 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.906 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.906 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.906 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.906 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.906 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.906 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.907 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.907 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.907 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.907 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.907 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.908 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.908 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.908 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.908 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.908 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.908 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.909 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.909 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.909 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.909 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.909 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.909 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.910 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.910 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.910 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.910 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.910 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.910 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.911 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.911 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.911 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.911 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.911 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.911 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.911 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.912 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.912 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.912 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.912 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.912 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.912 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.912 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.912 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.913 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.913 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.913 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.913 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.913 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.913 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.913 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.914 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.914 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.914 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.914 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.914 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.914 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.914 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.915 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.915 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.915 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.915 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.915 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.915 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.916 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.916 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.916 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.916 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.916 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.916 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.916 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.917 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.917 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.917 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.917 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.917 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.917 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.917 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.918 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.918 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.918 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.918 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.918 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.918 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.918 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.918 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.919 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.919 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.919 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.919 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.919 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.919 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.919 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.920 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.920 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.920 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.920 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.920 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.920 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.920 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.920 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.921 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.921 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.921 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.921 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.921 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.921 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.921 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.922 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.922 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.922 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.922 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.922 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.922 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.922 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.922 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.923 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.923 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.923 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.923 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.923 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.923 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.923 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.924 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.924 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.924 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.924 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.924 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.924 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.924 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.925 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.925 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.925 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.925 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.925 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.925 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.925 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.926 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.926 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.926 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.926 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.926 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.926 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.927 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.927 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.927 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.927 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.927 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.927 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.928 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.928 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.928 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.928 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.928 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.929 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.929 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.929 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.929 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.929 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.929 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.930 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.930 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.930 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.930 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.930 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.930 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.930 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.931 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.931 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.931 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.931 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.931 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.931 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.931 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.932 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.932 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.932 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.932 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.932 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.932 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.932 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.933 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.933 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.933 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.933 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.933 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.933 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.934 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.934 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.934 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.934 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.934 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.934 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.934 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.935 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.935 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.935 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.935 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.935 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.935 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.936 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.936 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.936 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.936 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.936 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.936 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.936 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.936 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.937 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.937 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.937 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.937 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.937 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.937 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.937 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.938 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.938 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.938 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.938 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.938 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.938 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.939 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.939 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.939 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.939 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.939 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.939 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.939 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.940 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.940 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.940 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.940 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.940 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.940 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.940 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.941 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.941 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.941 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.941 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.941 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.941 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.941 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.942 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.942 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.942 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.942 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.942 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.942 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.942 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.943 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.943 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.943 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.943 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.943 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.943 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.943 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.944 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.944 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.944 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.944 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.944 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.944 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.944 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.945 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.945 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.945 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.945 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.945 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.945 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.945 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.946 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.946 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.946 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.946 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.946 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.946 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.946 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.947 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.947 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.947 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.947 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.947 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.947 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.947 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.948 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.948 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.948 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.948 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.948 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.948 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.948 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.949 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.949 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.949 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.949 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.949 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.949 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.949 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.950 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.950 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.950 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.950 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.950 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.950 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.950 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.951 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.951 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.951 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.951 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.951 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.951 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.951 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.951 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.952 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.952 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.952 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.952 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.952 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.952 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.952 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.953 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.953 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.953 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.953 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.953 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.953 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.954 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.954 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.954 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.954 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.954 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.954 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.954 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.955 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.955 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.955 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.955 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.955 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.955 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.955 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.956 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.956 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.956 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.956 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.956 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.956 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.957 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.957 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.957 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.957 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.957 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.957 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.957 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.958 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.958 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.958 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.958 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.958 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.958 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.959 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.959 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.959 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.959 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.959 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.959 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.959 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.960 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.960 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.960 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.960 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.960 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.960 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.960 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.961 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.961 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.961 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.961 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.961 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.961 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.961 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.962 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.962 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.962 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.962 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.962 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.962 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.962 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.963 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.963 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.963 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.963 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.963 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.963 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.963 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.964 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.964 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.964 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.964 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.964 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.964 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.964 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.965 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.965 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.965 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.965 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.965 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.965 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.966 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.966 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.966 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.966 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.966 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.966 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.966 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.966 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.967 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.967 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.967 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.967 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.967 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.967 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.967 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.968 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.968 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.968 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.968 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.968 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.968 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.968 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.969 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.969 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.969 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.969 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.969 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.969 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.969 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.970 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.970 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.970 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.970 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.970 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.970 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.971 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.971 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.971 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.971 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.971 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.971 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.971 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.971 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.972 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.972 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.972 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.972 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.972 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.972 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.973 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.973 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.973 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.973 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.973 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.973 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.973 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.974 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.974 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.974 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.974 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.974 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.974 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.975 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.975 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.975 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.975 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.975 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.976 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.976 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.976 2 WARNING oslo_config.cfg [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct  7 16:03:05 np0005474864 nova_compute[191655]: live_migration_uri is deprecated for removal in favor of two other options that
Oct  7 16:03:05 np0005474864 nova_compute[191655]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct  7 16:03:05 np0005474864 nova_compute[191655]: and ``live_migration_inbound_addr`` respectively.
Oct  7 16:03:05 np0005474864 nova_compute[191655]: ).  Its value may be silently ignored in the future.#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.976 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
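[editor's note] The warning above reports that this deployment still sets the deprecated `[libvirt] live_migration_uri` option (here `qemu+tls://%s/system`), which nova replaces with the two options named in the warning text, `live_migration_scheme` and `live_migration_inbound_addr`. A minimal sketch of the equivalent replacement configuration — the inbound address value is a placeholder assumption, not taken from this log:

```ini
[libvirt]
# Deprecated form, as currently logged by this service:
#   live_migration_uri = qemu+tls://%s/system

# Preferred replacements per the deprecation notice:
# "tls" reproduces the qemu+tls:// scheme of the old URI.
live_migration_scheme = tls
# Address or hostname the target compute host listens on for
# incoming migrations (placeholder value, adjust per host).
live_migration_inbound_addr = <target-host-address>
```

With these set, nova composes the migration URI itself, so the `%s` host substitution in the old option is no longer needed.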
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.976 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.977 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.977 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.977 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.977 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.977 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.977 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.978 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.978 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.978 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.978 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.978 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.978 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.979 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.979 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.979 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.979 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.979 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.979 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.979 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.980 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.980 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.980 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.980 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.980 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.980 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.980 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.981 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.981 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.981 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.981 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.981 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.981 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.982 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.982 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.982 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.982 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.982 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.982 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.982 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.983 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.983 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.983 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.983 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.983 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.984 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.984 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.984 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.984 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.984 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.985 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.985 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.985 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.985 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.985 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.985 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.986 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.986 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.986 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.986 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.986 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.987 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.987 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.987 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.987 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.987 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.987 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.988 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.988 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.988 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.988 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.988 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.988 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.989 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.989 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.989 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.989 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.989 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.989 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.989 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.990 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.990 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.990 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.990 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.990 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.990 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.990 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.991 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.991 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.991 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.991 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.991 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.991 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.991 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.992 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.992 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.992 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.992 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.992 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.992 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.992 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.993 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.993 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.993 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.993 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.993 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.993 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.993 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.994 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.994 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.994 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.994 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.994 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.994 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.994 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.995 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.995 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.995 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.995 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.995 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.995 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.995 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.996 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.996 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.996 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.996 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.996 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.996 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.997 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.997 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.997 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.997 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.997 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.998 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.998 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.998 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.998 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.998 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.998 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.999 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.999 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.999 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.999 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:05 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.999 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:05.999 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.000 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.000 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.000 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.000 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.000 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.000 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.000 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.001 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.001 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.001 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.001 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.001 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.001 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.002 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.002 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.002 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.002 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.002 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.002 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.002 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.003 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.003 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.003 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.003 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.003 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.004 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.004 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.004 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.004 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.004 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.005 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.005 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.005 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.005 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.005 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.005 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.006 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.006 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.006 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.006 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.006 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.006 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.007 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.007 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.007 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.007 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.007 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.007 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.007 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.008 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.008 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.008 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.008 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.008 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.008 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.009 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.009 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.009 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.009 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.009 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.009 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.009 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.010 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.010 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.010 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.010 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.010 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.010 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.010 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.011 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.011 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.011 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.011 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.011 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.011 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.011 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.012 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.012 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.012 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.012 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.012 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.012 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.012 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.013 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.013 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.013 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.013 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.013 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.013 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.013 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.014 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.014 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.014 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.014 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.014 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.014 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.014 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.014 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.015 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.015 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.015 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.015 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.015 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.016 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.016 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.016 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.016 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.016 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.016 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.016 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.017 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.017 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.017 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.017 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.017 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.017 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.017 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.018 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.018 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.018 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.018 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.018 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.018 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.019 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.019 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.019 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.019 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.019 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.019 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.019 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.020 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.020 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.020 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.020 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.020 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.020 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.021 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.021 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.021 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.021 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.021 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.021 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.021 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.022 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.022 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.022 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.022 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.022 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.023 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.023 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.023 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.023 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.023 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.023 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.023 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.024 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.024 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.024 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.024 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.024 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.024 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.024 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.025 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.025 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.025 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.025 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.025 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.025 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.026 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.026 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.026 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.026 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.026 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.026 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.027 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.027 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.027 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.027 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.027 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.027 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.027 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.028 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.028 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.028 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.028 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.028 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.028 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.029 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.029 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.029 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.029 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.029 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.030 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.030 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.030 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.030 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.030 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.031 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.031 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.031 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.031 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.031 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.031 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.031 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.032 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.032 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.032 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.032 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.032 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.032 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.032 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.033 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.033 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.033 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.033 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.033 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.033 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.033 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.034 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.034 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.034 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.034 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.034 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.034 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.034 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.035 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.035 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.035 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.035 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.035 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.035 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.036 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.036 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.036 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.036 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.036 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.036 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.037 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.037 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.037 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.037 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.037 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.037 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.038 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.038 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.038 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.038 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.038 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.039 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.039 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.039 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.039 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.039 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.039 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.040 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.040 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.040 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.040 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.040 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.040 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.040 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.041 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.041 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.041 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.041 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.041 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.041 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.041 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.041 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.042 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.042 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.042 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.042 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.042 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.042 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.042 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.043 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.043 2 DEBUG oslo_service.service [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.044 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.058 2 DEBUG nova.virt.libvirt.host [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.059 2 DEBUG nova.virt.libvirt.host [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.059 2 DEBUG nova.virt.libvirt.host [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.060 2 DEBUG nova.virt.libvirt.host [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Oct  7 16:03:06 np0005474864 systemd[1]: Starting libvirt QEMU daemon...
Oct  7 16:03:06 np0005474864 systemd[1]: Started libvirt QEMU daemon.
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.126 2 DEBUG nova.virt.libvirt.host [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f63a1f12520> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.131 2 DEBUG nova.virt.libvirt.host [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f63a1f12520> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.132 2 INFO nova.virt.libvirt.driver [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Connection event '1' reason 'None'#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.149 2 WARNING nova.virt.libvirt.driver [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Oct  7 16:03:06 np0005474864 nova_compute[191655]: 2025-10-07 20:03:06.149 2 DEBUG nova.virt.libvirt.volume.mount [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Oct  7 16:03:06 np0005474864 python3.9[192172]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.061 2 INFO nova.virt.libvirt.host [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Libvirt host capabilities <capabilities>
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <host>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <uuid>2e9c4e5f-0506-4565-be4b-95bb9b08ebdc</uuid>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <cpu>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <arch>x86_64</arch>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model>EPYC-Rome-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <vendor>AMD</vendor>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <microcode version='16777317'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <signature family='23' model='49' stepping='0'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <maxphysaddr mode='emulate' bits='40'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='x2apic'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='tsc-deadline'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='osxsave'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='hypervisor'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='tsc_adjust'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='spec-ctrl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='stibp'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='arch-capabilities'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='ssbd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='cmp_legacy'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='topoext'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='virt-ssbd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='lbrv'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='tsc-scale'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='vmcb-clean'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='pause-filter'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='pfthreshold'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='svme-addr-chk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='rdctl-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='skip-l1dfl-vmentry'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='mds-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature name='pschange-mc-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <pages unit='KiB' size='4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <pages unit='KiB' size='2048'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <pages unit='KiB' size='1048576'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </cpu>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <power_management>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <suspend_mem/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <suspend_disk/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <suspend_hybrid/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </power_management>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <iommu support='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <migration_features>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <live/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <uri_transports>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <uri_transport>tcp</uri_transport>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <uri_transport>rdma</uri_transport>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </uri_transports>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </migration_features>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <topology>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <cells num='1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <cell id='0'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:          <memory unit='KiB'>7864096</memory>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:          <pages unit='KiB' size='4'>1966024</pages>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:          <pages unit='KiB' size='2048'>0</pages>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:          <pages unit='KiB' size='1048576'>0</pages>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:          <distances>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:            <sibling id='0' value='10'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:          </distances>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:          <cpus num='8'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:          </cpus>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        </cell>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </cells>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </topology>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <cache>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </cache>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <secmodel>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model>selinux</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <doi>0</doi>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </secmodel>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <secmodel>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model>dac</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <doi>0</doi>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <baselabel type='kvm'>+107:+107</baselabel>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <baselabel type='qemu'>+107:+107</baselabel>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </secmodel>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </host>
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <guest>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <os_type>hvm</os_type>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <arch name='i686'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <wordsize>32</wordsize>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <domain type='qemu'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <domain type='kvm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </arch>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <features>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <pae/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <nonpae/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <acpi default='on' toggle='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <apic default='on' toggle='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <cpuselection/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <deviceboot/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <disksnapshot default='on' toggle='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <externalSnapshot/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </features>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </guest>
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <guest>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <os_type>hvm</os_type>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <arch name='x86_64'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <wordsize>64</wordsize>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <domain type='qemu'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <domain type='kvm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </arch>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <features>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <acpi default='on' toggle='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <apic default='on' toggle='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <cpuselection/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <deviceboot/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <disksnapshot default='on' toggle='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <externalSnapshot/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </features>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </guest>
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 
Oct  7 16:03:07 np0005474864 nova_compute[191655]: </capabilities>
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.071 2 DEBUG nova.virt.libvirt.host [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.101 2 DEBUG nova.virt.libvirt.host [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct  7 16:03:07 np0005474864 nova_compute[191655]: <domainCapabilities>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <path>/usr/libexec/qemu-kvm</path>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <domain>kvm</domain>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <arch>i686</arch>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <vcpu max='4096'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <iothreads supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <os supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <enum name='firmware'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <loader supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='type'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>rom</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>pflash</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='readonly'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>yes</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>no</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='secure'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>no</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </loader>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </os>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <cpu>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <mode name='host-passthrough' supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='hostPassthroughMigratable'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>on</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>off</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </mode>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <mode name='maximum' supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='maximumMigratable'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>on</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>off</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </mode>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <mode name='host-model' supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <vendor>AMD</vendor>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='x2apic'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='tsc-deadline'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='hypervisor'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='tsc_adjust'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='spec-ctrl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='stibp'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='arch-capabilities'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='ssbd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='cmp_legacy'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='overflow-recov'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='succor'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='ibrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='amd-ssbd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='virt-ssbd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='lbrv'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='tsc-scale'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='vmcb-clean'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='flushbyasid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='pause-filter'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='pfthreshold'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='svme-addr-chk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='rdctl-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='mds-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='pschange-mc-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='gds-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='rfds-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='disable' name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </mode>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <mode name='custom' supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-noTSX'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v5'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cooperlake'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cooperlake-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cooperlake-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Denverton'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mpx'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Denverton-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mpx'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Denverton-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Denverton-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Dhyana-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Genoa'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amd-psfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='auto-ibrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='stibp-always-on'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Genoa-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amd-psfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='auto-ibrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='stibp-always-on'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Milan'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Milan-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Milan-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amd-psfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='stibp-always-on'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Rome'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Rome-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Rome-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Rome-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='GraniteRapids'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='prefetchiti'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='GraniteRapids-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='prefetchiti'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='GraniteRapids-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx10'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx10-128'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx10-256'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx10-512'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='prefetchiti'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-noTSX'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-noTSX'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v5'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v6'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v7'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='IvyBridge'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='IvyBridge-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='IvyBridge-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='IvyBridge-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='KnightsMill'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-4fmaps'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-4vnniw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512er'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512pf'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='KnightsMill-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-4fmaps'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-4vnniw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512er'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512pf'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Opteron_G4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fma4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xop'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Opteron_G4-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fma4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xop'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Opteron_G5'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fma4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tbm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xop'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Opteron_G5-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fma4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tbm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xop'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SapphireRapids'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SapphireRapids-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SapphireRapids-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SapphireRapids-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SierraForest'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-ne-convert'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cmpccxadd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SierraForest-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-ne-convert'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cmpccxadd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v5'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='core-capability'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mpx'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='split-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='core-capability'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mpx'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='split-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='core-capability'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='split-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='core-capability'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='split-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='athlon'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnow'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnowext'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='athlon-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnow'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnowext'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='core2duo'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='core2duo-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='coreduo'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='coreduo-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='n270'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='n270-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='phenom'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnow'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnowext'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='phenom-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnow'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnowext'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </mode>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </cpu>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <memoryBacking supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <enum name='sourceType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>file</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>anonymous</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>memfd</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </memoryBacking>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <devices>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <disk supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='diskDevice'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>disk</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>cdrom</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>floppy</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>lun</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='bus'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>fdc</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>scsi</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>usb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>sata</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio-transitional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio-non-transitional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </disk>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <graphics supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='type'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vnc</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>egl-headless</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>dbus</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </graphics>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <video supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='modelType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vga</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>cirrus</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>none</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>bochs</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>ramfb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </video>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <hostdev supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='mode'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>subsystem</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='startupPolicy'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>default</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>mandatory</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>requisite</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>optional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='subsysType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>usb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>pci</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>scsi</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='capsType'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='pciBackend'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </hostdev>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <rng supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio-transitional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio-non-transitional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendModel'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>random</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>egd</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>builtin</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </rng>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <filesystem supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='driverType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>path</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>handle</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtiofs</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </filesystem>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <tpm supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>tpm-tis</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>tpm-crb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendModel'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>emulator</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>external</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendVersion'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>2.0</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </tpm>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <redirdev supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='bus'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>usb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </redirdev>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <channel supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='type'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>pty</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>unix</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </channel>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <crypto supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='type'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>qemu</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendModel'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>builtin</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </crypto>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <interface supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>default</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>passt</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </interface>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <panic supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>isa</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>hyperv</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </panic>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </devices>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <features>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <gic supported='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <vmcoreinfo supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <genid supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <backingStoreInput supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <backup supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <async-teardown supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <ps2 supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <sev supported='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <sgx supported='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <hyperv supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='features'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>relaxed</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vapic</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>spinlocks</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vpindex</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>runtime</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>synic</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>stimer</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>reset</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vendor_id</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>frequencies</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>reenlightenment</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>tlbflush</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>ipi</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>avic</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>emsr_bitmap</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>xmm_input</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </hyperv>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <launchSecurity supported='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </features>
Oct  7 16:03:07 np0005474864 nova_compute[191655]: </domainCapabilities>
Oct  7 16:03:07 np0005474864 nova_compute[191655]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.108 2 DEBUG nova.virt.libvirt.host [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct  7 16:03:07 np0005474864 nova_compute[191655]: <domainCapabilities>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <path>/usr/libexec/qemu-kvm</path>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <domain>kvm</domain>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <arch>i686</arch>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <vcpu max='240'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <iothreads supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <os supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <enum name='firmware'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <loader supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='type'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>rom</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>pflash</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='readonly'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>yes</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>no</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='secure'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>no</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </loader>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </os>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <cpu>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <mode name='host-passthrough' supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='hostPassthroughMigratable'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>on</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>off</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </mode>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <mode name='maximum' supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='maximumMigratable'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>on</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>off</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </mode>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <mode name='host-model' supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <vendor>AMD</vendor>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='x2apic'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='tsc-deadline'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='hypervisor'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='tsc_adjust'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='spec-ctrl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='stibp'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='arch-capabilities'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='ssbd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='cmp_legacy'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='overflow-recov'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='succor'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='ibrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='amd-ssbd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='virt-ssbd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='lbrv'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='tsc-scale'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='vmcb-clean'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='flushbyasid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='pause-filter'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='pfthreshold'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='svme-addr-chk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='rdctl-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='mds-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='pschange-mc-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='gds-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='rfds-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='disable' name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </mode>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <mode name='custom' supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-noTSX'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v5'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cooperlake'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cooperlake-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cooperlake-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Denverton'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mpx'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Denverton-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mpx'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Denverton-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Denverton-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Dhyana-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Genoa'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amd-psfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='auto-ibrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='stibp-always-on'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Genoa-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amd-psfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='auto-ibrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='stibp-always-on'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Milan'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Milan-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Milan-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amd-psfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='stibp-always-on'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Rome'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Rome-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Rome-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Rome-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='GraniteRapids'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='prefetchiti'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='GraniteRapids-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='prefetchiti'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='GraniteRapids-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx10'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx10-128'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx10-256'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx10-512'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='prefetchiti'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-noTSX'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-noTSX'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v5'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v6'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v7'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='IvyBridge'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='IvyBridge-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='IvyBridge-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='IvyBridge-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='KnightsMill'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-4fmaps'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-4vnniw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512er'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512pf'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='KnightsMill-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-4fmaps'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-4vnniw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512er'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512pf'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Opteron_G4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fma4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xop'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Opteron_G4-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fma4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xop'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Opteron_G5'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fma4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tbm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xop'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Opteron_G5-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fma4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tbm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xop'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SapphireRapids'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SapphireRapids-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SapphireRapids-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SapphireRapids-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SierraForest'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-ne-convert'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cmpccxadd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SierraForest-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-ne-convert'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cmpccxadd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v5'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='core-capability'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mpx'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='split-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='core-capability'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mpx'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='split-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='core-capability'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='split-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='core-capability'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='split-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='athlon'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnow'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnowext'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='athlon-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnow'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnowext'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='core2duo'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='core2duo-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='coreduo'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='coreduo-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='n270'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='n270-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='phenom'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnow'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnowext'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='phenom-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnow'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnowext'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </mode>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </cpu>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <memoryBacking supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <enum name='sourceType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>file</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>anonymous</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>memfd</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </memoryBacking>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <devices>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <disk supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='diskDevice'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>disk</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>cdrom</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>floppy</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>lun</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='bus'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>ide</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>fdc</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>scsi</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>usb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>sata</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio-transitional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio-non-transitional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </disk>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <graphics supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='type'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vnc</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>egl-headless</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>dbus</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </graphics>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <video supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='modelType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vga</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>cirrus</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>none</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>bochs</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>ramfb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </video>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <hostdev supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='mode'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>subsystem</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='startupPolicy'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>default</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>mandatory</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>requisite</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>optional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='subsysType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>usb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>pci</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>scsi</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='capsType'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='pciBackend'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </hostdev>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <rng supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio-transitional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio-non-transitional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendModel'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>random</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>egd</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>builtin</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </rng>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <filesystem supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='driverType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>path</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>handle</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtiofs</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </filesystem>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <tpm supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>tpm-tis</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>tpm-crb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendModel'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>emulator</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>external</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendVersion'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>2.0</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </tpm>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <redirdev supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='bus'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>usb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </redirdev>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <channel supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='type'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>pty</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>unix</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </channel>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <crypto supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='type'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>qemu</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendModel'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>builtin</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </crypto>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <interface supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>default</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>passt</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </interface>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <panic supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>isa</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>hyperv</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </panic>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </devices>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <features>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <gic supported='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <vmcoreinfo supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <genid supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <backingStoreInput supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <backup supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <async-teardown supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <ps2 supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <sev supported='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <sgx supported='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <hyperv supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='features'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>relaxed</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vapic</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>spinlocks</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vpindex</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>runtime</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>synic</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>stimer</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>reset</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vendor_id</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>frequencies</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>reenlightenment</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>tlbflush</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>ipi</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>avic</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>emsr_bitmap</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>xmm_input</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </hyperv>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <launchSecurity supported='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </features>
Oct  7 16:03:07 np0005474864 nova_compute[191655]: </domainCapabilities>
Oct  7 16:03:07 np0005474864 nova_compute[191655]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.145 2 DEBUG nova.virt.libvirt.host [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.150 2 DEBUG nova.virt.libvirt.host [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct  7 16:03:07 np0005474864 nova_compute[191655]: <domainCapabilities>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <path>/usr/libexec/qemu-kvm</path>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <domain>kvm</domain>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <arch>x86_64</arch>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <vcpu max='240'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <iothreads supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <os supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <enum name='firmware'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <loader supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='type'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>rom</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>pflash</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='readonly'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>yes</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>no</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='secure'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>no</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </loader>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </os>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <cpu>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <mode name='host-passthrough' supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='hostPassthroughMigratable'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>on</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>off</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </mode>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <mode name='maximum' supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='maximumMigratable'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>on</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>off</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </mode>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <mode name='host-model' supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <vendor>AMD</vendor>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='x2apic'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='tsc-deadline'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='hypervisor'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='tsc_adjust'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='spec-ctrl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='stibp'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='arch-capabilities'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='ssbd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='cmp_legacy'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='overflow-recov'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='succor'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='ibrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='amd-ssbd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='virt-ssbd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='lbrv'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='tsc-scale'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='vmcb-clean'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='flushbyasid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='pause-filter'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='pfthreshold'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='svme-addr-chk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='rdctl-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='mds-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='pschange-mc-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='gds-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='rfds-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='disable' name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </mode>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <mode name='custom' supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-noTSX'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v5'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cooperlake'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cooperlake-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cooperlake-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Denverton'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mpx'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Denverton-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mpx'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Denverton-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Denverton-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Dhyana-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Genoa'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amd-psfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='auto-ibrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='stibp-always-on'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Genoa-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amd-psfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='auto-ibrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='stibp-always-on'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Milan'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Milan-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Milan-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amd-psfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='stibp-always-on'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Rome'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Rome-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Rome-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Rome-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='GraniteRapids'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='prefetchiti'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='GraniteRapids-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='prefetchiti'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='GraniteRapids-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx10'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx10-128'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx10-256'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx10-512'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='prefetchiti'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-noTSX'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-noTSX'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v5'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v6'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v7'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='IvyBridge'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='IvyBridge-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='IvyBridge-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='IvyBridge-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='KnightsMill'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-4fmaps'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-4vnniw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512er'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512pf'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='KnightsMill-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-4fmaps'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-4vnniw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512er'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512pf'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Opteron_G4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fma4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xop'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Opteron_G4-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fma4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xop'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Opteron_G5'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fma4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tbm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xop'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Opteron_G5-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fma4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tbm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xop'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SapphireRapids'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SapphireRapids-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SapphireRapids-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SapphireRapids-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SierraForest'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-ne-convert'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cmpccxadd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SierraForest-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-ne-convert'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cmpccxadd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v5'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='core-capability'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mpx'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='split-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='core-capability'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mpx'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='split-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='core-capability'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='split-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='core-capability'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='split-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='athlon'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnow'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnowext'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='athlon-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnow'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnowext'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='core2duo'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='core2duo-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='coreduo'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='coreduo-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='n270'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='n270-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='phenom'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnow'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnowext'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='phenom-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnow'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnowext'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </mode>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </cpu>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <memoryBacking supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <enum name='sourceType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>file</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>anonymous</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>memfd</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </memoryBacking>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <devices>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <disk supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='diskDevice'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>disk</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>cdrom</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>floppy</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>lun</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='bus'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>ide</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>fdc</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>scsi</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>usb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>sata</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio-transitional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio-non-transitional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </disk>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <graphics supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='type'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vnc</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>egl-headless</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>dbus</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </graphics>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <video supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='modelType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vga</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>cirrus</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>none</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>bochs</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>ramfb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </video>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <hostdev supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='mode'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>subsystem</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='startupPolicy'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>default</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>mandatory</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>requisite</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>optional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='subsysType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>usb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>pci</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>scsi</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='capsType'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='pciBackend'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </hostdev>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <rng supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio-transitional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio-non-transitional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendModel'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>random</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>egd</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>builtin</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </rng>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <filesystem supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='driverType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>path</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>handle</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtiofs</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </filesystem>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <tpm supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>tpm-tis</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>tpm-crb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendModel'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>emulator</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>external</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendVersion'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>2.0</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </tpm>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <redirdev supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='bus'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>usb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </redirdev>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <channel supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='type'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>pty</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>unix</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </channel>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <crypto supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='type'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>qemu</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendModel'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>builtin</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </crypto>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <interface supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>default</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>passt</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </interface>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <panic supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>isa</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>hyperv</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </panic>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </devices>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <features>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <gic supported='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <vmcoreinfo supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <genid supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <backingStoreInput supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <backup supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <async-teardown supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <ps2 supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <sev supported='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <sgx supported='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <hyperv supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='features'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>relaxed</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vapic</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>spinlocks</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vpindex</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>runtime</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>synic</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>stimer</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>reset</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vendor_id</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>frequencies</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>reenlightenment</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>tlbflush</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>ipi</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>avic</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>emsr_bitmap</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>xmm_input</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </hyperv>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <launchSecurity supported='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </features>
Oct  7 16:03:07 np0005474864 nova_compute[191655]: </domainCapabilities>
Oct  7 16:03:07 np0005474864 nova_compute[191655]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.223 2 DEBUG nova.virt.libvirt.host [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct  7 16:03:07 np0005474864 nova_compute[191655]: <domainCapabilities>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <path>/usr/libexec/qemu-kvm</path>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <domain>kvm</domain>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <arch>x86_64</arch>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <vcpu max='4096'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <iothreads supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <os supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <enum name='firmware'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>efi</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <loader supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='type'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>rom</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>pflash</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='readonly'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>yes</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>no</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='secure'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>yes</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>no</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </loader>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </os>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <cpu>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <mode name='host-passthrough' supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='hostPassthroughMigratable'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>on</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>off</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </mode>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <mode name='maximum' supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='maximumMigratable'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>on</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>off</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </mode>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <mode name='host-model' supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <vendor>AMD</vendor>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='x2apic'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='tsc-deadline'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='hypervisor'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='tsc_adjust'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='spec-ctrl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='stibp'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='arch-capabilities'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='ssbd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='cmp_legacy'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='overflow-recov'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='succor'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='ibrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='amd-ssbd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='virt-ssbd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='lbrv'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='tsc-scale'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='vmcb-clean'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='flushbyasid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='pause-filter'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='pfthreshold'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='svme-addr-chk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='rdctl-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='mds-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='pschange-mc-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='gds-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='require' name='rfds-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <feature policy='disable' name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </mode>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <mode name='custom' supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-noTSX'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Broadwell-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cascadelake-Server-v5'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cooperlake'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cooperlake-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Cooperlake-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Denverton'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mpx'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Denverton-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mpx'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Denverton-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Denverton-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Dhyana-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Genoa'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amd-psfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='auto-ibrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='stibp-always-on'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Genoa-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amd-psfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='auto-ibrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='stibp-always-on'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Milan'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Milan-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Milan-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amd-psfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='stibp-always-on'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Rome'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Rome-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Rome-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-Rome-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='EPYC-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='GraniteRapids'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='prefetchiti'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='GraniteRapids-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='prefetchiti'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='GraniteRapids-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx10'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx10-128'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx10-256'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx10-512'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='prefetchiti'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-noTSX'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Haswell-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-noTSX'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v5'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v6'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Icelake-Server-v7'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='IvyBridge'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='IvyBridge-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='IvyBridge-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='IvyBridge-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='KnightsMill'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-4fmaps'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-4vnniw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512er'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512pf'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='KnightsMill-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-4fmaps'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-4vnniw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512er'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512pf'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Opteron_G4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fma4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xop'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Opteron_G4-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fma4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xop'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Opteron_G5'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fma4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tbm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xop'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Opteron_G5-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fma4'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tbm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xop'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SapphireRapids'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SapphireRapids-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SapphireRapids-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SapphireRapids-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='amx-tile'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-bf16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-fp16'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bitalg'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrc'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fzrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='la57'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='taa-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xfd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SierraForest'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-ne-convert'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cmpccxadd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='SierraForest-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-ifma'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-ne-convert'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx-vnni-int8'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cmpccxadd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fbsdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='fsrs'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ibrs-all'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mcdt-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pbrsb-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='psdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='serialize'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vaes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Client-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='hle'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='rtm'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Skylake-Server-v5'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512bw'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512cd'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512dq'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512f'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='avx512vl'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='invpcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pcid'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='pku'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='core-capability'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mpx'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='split-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='core-capability'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='mpx'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='split-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge-v2'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='core-capability'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='split-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge-v3'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='core-capability'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='split-lock-detect'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='Snowridge-v4'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='cldemote'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='erms'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='gfni'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdir64b'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='movdiri'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='xsaves'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='athlon'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnow'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnowext'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='athlon-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnow'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnowext'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='core2duo'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='core2duo-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='coreduo'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='coreduo-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='n270'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='n270-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='ss'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='phenom'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnow'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnowext'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <blockers model='phenom-v1'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnow'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <feature name='3dnowext'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </blockers>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </mode>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </cpu>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <memoryBacking supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <enum name='sourceType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>file</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>anonymous</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <value>memfd</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </memoryBacking>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <devices>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <disk supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='diskDevice'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>disk</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>cdrom</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>floppy</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>lun</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='bus'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>fdc</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>scsi</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>usb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>sata</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio-transitional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio-non-transitional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </disk>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <graphics supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='type'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vnc</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>egl-headless</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>dbus</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </graphics>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <video supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='modelType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vga</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>cirrus</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>none</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>bochs</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>ramfb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </video>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <hostdev supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='mode'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>subsystem</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='startupPolicy'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>default</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>mandatory</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>requisite</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>optional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='subsysType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>usb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>pci</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>scsi</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='capsType'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='pciBackend'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </hostdev>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <rng supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio-transitional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtio-non-transitional</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendModel'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>random</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>egd</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>builtin</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </rng>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <filesystem supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='driverType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>path</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>handle</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>virtiofs</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </filesystem>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <tpm supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>tpm-tis</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>tpm-crb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendModel'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>emulator</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>external</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendVersion'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>2.0</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </tpm>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <redirdev supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='bus'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>usb</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </redirdev>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <channel supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='type'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>pty</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>unix</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </channel>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <crypto supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='type'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>qemu</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendModel'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>builtin</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </crypto>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <interface supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='backendType'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>default</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>passt</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </interface>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <panic supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='model'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>isa</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>hyperv</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </panic>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </devices>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <features>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <gic supported='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <vmcoreinfo supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <genid supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <backingStoreInput supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <backup supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <async-teardown supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <ps2 supported='yes'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <sev supported='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <sgx supported='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <hyperv supported='yes'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      <enum name='features'>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>relaxed</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vapic</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>spinlocks</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vpindex</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>runtime</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>synic</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>stimer</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>reset</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>vendor_id</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>frequencies</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>reenlightenment</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>tlbflush</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>ipi</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>avic</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>emsr_bitmap</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:        <value>xmm_input</value>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:      </enum>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    </hyperv>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:    <launchSecurity supported='no'/>
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  </features>
Oct  7 16:03:07 np0005474864 nova_compute[191655]: </domainCapabilities>
Oct  7 16:03:07 np0005474864 nova_compute[191655]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.281 2 DEBUG nova.virt.libvirt.host [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.282 2 DEBUG nova.virt.libvirt.host [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.282 2 DEBUG nova.virt.libvirt.host [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.283 2 INFO nova.virt.libvirt.host [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Secure Boot support detected#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.285 2 INFO nova.virt.libvirt.driver [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.286 2 INFO nova.virt.libvirt.driver [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.301 2 DEBUG nova.virt.libvirt.driver [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] cpu compare xml: <cpu match="exact">
Oct  7 16:03:07 np0005474864 nova_compute[191655]:  <model>Nehalem</model>
Oct  7 16:03:07 np0005474864 nova_compute[191655]: </cpu>
Oct  7 16:03:07 np0005474864 nova_compute[191655]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.307 2 DEBUG nova.virt.libvirt.driver [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.334 2 INFO nova.virt.node [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Determined node identity 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 from /var/lib/nova/compute_id#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.351 2 WARNING nova.compute.manager [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Compute nodes ['63545c2e-7bb7-4b7a-9af2-ee768bda9cb4'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.378 2 INFO nova.compute.manager [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.410 2 WARNING nova.compute.manager [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.410 2 DEBUG oslo_concurrency.lockutils [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.411 2 DEBUG oslo_concurrency.lockutils [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.411 2 DEBUG oslo_concurrency.lockutils [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.411 2 DEBUG nova.compute.resource_tracker [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:03:07 np0005474864 python3.9[192337]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  7 16:03:07 np0005474864 systemd[1]: Starting libvirt nodedev daemon...
Oct  7 16:03:07 np0005474864 systemd[1]: Started libvirt nodedev daemon.
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.787 2 WARNING nova.virt.libvirt.driver [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.788 2 DEBUG nova.compute.resource_tracker [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6191MB free_disk=73.66975021362305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.789 2 DEBUG oslo_concurrency.lockutils [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.789 2 DEBUG oslo_concurrency.lockutils [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.807 2 WARNING nova.compute.resource_tracker [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] No compute node record for compute-2.ctlplane.example.com:63545c2e-7bb7-4b7a-9af2-ee768bda9cb4: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 could not be found.#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.834 2 INFO nova.compute.resource_tracker [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Compute node record created for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com with uuid: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.882 2 DEBUG nova.compute.resource_tracker [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:03:07 np0005474864 nova_compute[191655]: 2025-10-07 20:03:07.883 2 DEBUG nova.compute.resource_tracker [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:03:07 np0005474864 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  7 16:03:08 np0005474864 nova_compute[191655]: 2025-10-07 20:03:08.408 2 INFO nova.scheduler.client.report [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] [req-94ea819f-614e-47d2-9b8b-a1fed3ac982d] Created resource provider record via placement API for resource provider with UUID 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 and name compute-2.ctlplane.example.com.#033[00m
Oct  7 16:03:08 np0005474864 nova_compute[191655]: 2025-10-07 20:03:08.481 2 DEBUG nova.virt.libvirt.host [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Oct  7 16:03:08 np0005474864 nova_compute[191655]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Oct  7 16:03:08 np0005474864 nova_compute[191655]: 2025-10-07 20:03:08.481 2 INFO nova.virt.libvirt.host [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] kernel doesn't support AMD SEV#033[00m
Oct  7 16:03:08 np0005474864 nova_compute[191655]: 2025-10-07 20:03:08.482 2 DEBUG nova.compute.provider_tree [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Updating inventory in ProviderTree for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  7 16:03:08 np0005474864 nova_compute[191655]: 2025-10-07 20:03:08.483 2 DEBUG nova.virt.libvirt.driver [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  7 16:03:08 np0005474864 nova_compute[191655]: 2025-10-07 20:03:08.487 2 DEBUG nova.virt.libvirt.driver [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Libvirt baseline CPU <cpu>
Oct  7 16:03:08 np0005474864 nova_compute[191655]:  <arch>x86_64</arch>
Oct  7 16:03:08 np0005474864 nova_compute[191655]:  <model>Nehalem</model>
Oct  7 16:03:08 np0005474864 nova_compute[191655]:  <vendor>AMD</vendor>
Oct  7 16:03:08 np0005474864 nova_compute[191655]:  <topology sockets="8" cores="1" threads="1"/>
Oct  7 16:03:08 np0005474864 nova_compute[191655]: </cpu>
Oct  7 16:03:08 np0005474864 nova_compute[191655]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Oct  7 16:03:08 np0005474864 nova_compute[191655]: 2025-10-07 20:03:08.538 2 DEBUG nova.scheduler.client.report [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Updated inventory for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Oct  7 16:03:08 np0005474864 nova_compute[191655]: 2025-10-07 20:03:08.538 2 DEBUG nova.compute.provider_tree [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Updating resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  7 16:03:08 np0005474864 nova_compute[191655]: 2025-10-07 20:03:08.538 2 DEBUG nova.compute.provider_tree [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Updating inventory in ProviderTree for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  7 16:03:08 np0005474864 nova_compute[191655]: 2025-10-07 20:03:08.628 2 DEBUG nova.compute.provider_tree [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Updating resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  7 16:03:08 np0005474864 python3.9[192534]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 16:03:08 np0005474864 nova_compute[191655]: 2025-10-07 20:03:08.650 2 DEBUG nova.compute.resource_tracker [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:03:08 np0005474864 nova_compute[191655]: 2025-10-07 20:03:08.650 2 DEBUG oslo_concurrency.lockutils [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.862s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:03:08 np0005474864 nova_compute[191655]: 2025-10-07 20:03:08.651 2 DEBUG nova.service [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Oct  7 16:03:08 np0005474864 systemd[1]: Stopping nova_compute container...
Oct  7 16:03:08 np0005474864 nova_compute[191655]: 2025-10-07 20:03:08.734 2 DEBUG nova.service [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Oct  7 16:03:08 np0005474864 nova_compute[191655]: 2025-10-07 20:03:08.735 2 DEBUG nova.servicegroup.drivers.db [None req-9624bc7c-2f11-4ee0-a0f5-9b5cd5937def - - - - - -] DB_Driver: join new ServiceGroup member compute-2.ctlplane.example.com to the compute group, service = <Service: host=compute-2.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Oct  7 16:03:08 np0005474864 nova_compute[191655]: 2025-10-07 20:03:08.814 2 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored#033[00m
Oct  7 16:03:08 np0005474864 nova_compute[191655]: 2025-10-07 20:03:08.817 2 DEBUG oslo_concurrency.lockutils [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:03:08 np0005474864 nova_compute[191655]: 2025-10-07 20:03:08.817 2 DEBUG oslo_concurrency.lockutils [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:03:08 np0005474864 nova_compute[191655]: 2025-10-07 20:03:08.817 2 DEBUG oslo_concurrency.lockutils [None req-1ac07683-9f27-4dd7-bcff-e5c4cfadbc18 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:03:09 np0005474864 virtqemud[192092]: libvirt version: 10.10.0, package: 15.el9 (builder@centos.org, 2025-08-18-13:22:20, )
Oct  7 16:03:09 np0005474864 virtqemud[192092]: hostname: compute-2
Oct  7 16:03:09 np0005474864 virtqemud[192092]: End of file while reading data: Input/output error
Oct  7 16:03:09 np0005474864 systemd[1]: libpod-56e5458d48f17d65bd69097440dbe09e54ba1addc24caa08d0339d7c735ae8da.scope: Deactivated successfully.
Oct  7 16:03:09 np0005474864 systemd[1]: libpod-56e5458d48f17d65bd69097440dbe09e54ba1addc24caa08d0339d7c735ae8da.scope: Consumed 3.407s CPU time.
Oct  7 16:03:09 np0005474864 podman[192538]: 2025-10-07 20:03:09.324190891 +0000 UTC m=+0.585701170 container died 56e5458d48f17d65bd69097440dbe09e54ba1addc24caa08d0339d7c735ae8da (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:03:09 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-56e5458d48f17d65bd69097440dbe09e54ba1addc24caa08d0339d7c735ae8da-userdata-shm.mount: Deactivated successfully.
Oct  7 16:03:09 np0005474864 systemd[1]: var-lib-containers-storage-overlay-8c795aed9e99a77d36679c30bccdbe12dd7a604af170dc09ecab03ded25b3ea5-merged.mount: Deactivated successfully.
Oct  7 16:03:09 np0005474864 podman[192538]: 2025-10-07 20:03:09.385495362 +0000 UTC m=+0.647005671 container cleanup 56e5458d48f17d65bd69097440dbe09e54ba1addc24caa08d0339d7c735ae8da (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  7 16:03:09 np0005474864 podman[192538]: nova_compute
Oct  7 16:03:09 np0005474864 podman[192565]: nova_compute
Oct  7 16:03:09 np0005474864 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Oct  7 16:03:09 np0005474864 systemd[1]: Stopped nova_compute container.
Oct  7 16:03:09 np0005474864 systemd[1]: Starting nova_compute container...
Oct  7 16:03:09 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:03:09 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c795aed9e99a77d36679c30bccdbe12dd7a604af170dc09ecab03ded25b3ea5/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Oct  7 16:03:09 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c795aed9e99a77d36679c30bccdbe12dd7a604af170dc09ecab03ded25b3ea5/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Oct  7 16:03:09 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c795aed9e99a77d36679c30bccdbe12dd7a604af170dc09ecab03ded25b3ea5/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  7 16:03:09 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c795aed9e99a77d36679c30bccdbe12dd7a604af170dc09ecab03ded25b3ea5/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Oct  7 16:03:09 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c795aed9e99a77d36679c30bccdbe12dd7a604af170dc09ecab03ded25b3ea5/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Oct  7 16:03:09 np0005474864 podman[192578]: 2025-10-07 20:03:09.636801753 +0000 UTC m=+0.134232325 container init 56e5458d48f17d65bd69097440dbe09e54ba1addc24caa08d0339d7c735ae8da (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=nova_compute, io.buildah.version=1.41.3)
Oct  7 16:03:09 np0005474864 podman[192578]: 2025-10-07 20:03:09.645931297 +0000 UTC m=+0.143361849 container start 56e5458d48f17d65bd69097440dbe09e54ba1addc24caa08d0339d7c735ae8da (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=nova_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, config_id=edpm)
Oct  7 16:03:09 np0005474864 podman[192578]: nova_compute
Oct  7 16:03:09 np0005474864 nova_compute[192593]: + sudo -E kolla_set_configs
Oct  7 16:03:09 np0005474864 systemd[1]: Started nova_compute container.
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Validating config file
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Copying service configuration files
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Deleting /etc/nova/nova.conf
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Deleting /etc/ceph
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Creating directory /etc/ceph
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Setting permission for /etc/ceph
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Writing out command to execute
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Oct  7 16:03:09 np0005474864 nova_compute[192593]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Oct  7 16:03:09 np0005474864 nova_compute[192593]: ++ cat /run_command
Oct  7 16:03:09 np0005474864 nova_compute[192593]: + CMD=nova-compute
Oct  7 16:03:09 np0005474864 nova_compute[192593]: + ARGS=
Oct  7 16:03:09 np0005474864 nova_compute[192593]: + sudo kolla_copy_cacerts
Oct  7 16:03:09 np0005474864 nova_compute[192593]: + [[ ! -n '' ]]
Oct  7 16:03:09 np0005474864 nova_compute[192593]: + . kolla_extend_start
Oct  7 16:03:09 np0005474864 nova_compute[192593]: + echo 'Running command: '\''nova-compute'\'''
Oct  7 16:03:09 np0005474864 nova_compute[192593]: Running command: 'nova-compute'
Oct  7 16:03:09 np0005474864 nova_compute[192593]: + umask 0022
Oct  7 16:03:09 np0005474864 nova_compute[192593]: + exec nova-compute
Oct  7 16:03:10 np0005474864 python3.9[192757]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Oct  7 16:03:11 np0005474864 systemd[1]: Started libpod-conmon-902c3d15cf8da79783d1c46d2ea9a08f5e156e92a14414eb6f989af3ff5557ef.scope.
Oct  7 16:03:11 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:03:11 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5f0a0e5b454c3b258c96a2c069858ccf7c0c82c7dcc2affd6f0c301a0d75c38/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Oct  7 16:03:11 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5f0a0e5b454c3b258c96a2c069858ccf7c0c82c7dcc2affd6f0c301a0d75c38/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Oct  7 16:03:11 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c5f0a0e5b454c3b258c96a2c069858ccf7c0c82c7dcc2affd6f0c301a0d75c38/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Oct  7 16:03:11 np0005474864 podman[192782]: 2025-10-07 20:03:11.27284815 +0000 UTC m=+0.195206657 container init 902c3d15cf8da79783d1c46d2ea9a08f5e156e92a14414eb6f989af3ff5557ef (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, config_id=edpm, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  7 16:03:11 np0005474864 podman[192782]: 2025-10-07 20:03:11.286612361 +0000 UTC m=+0.208970848 container start 902c3d15cf8da79783d1c46d2ea9a08f5e156e92a14414eb6f989af3ff5557ef (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  7 16:03:11 np0005474864 python3.9[192757]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Oct  7 16:03:11 np0005474864 nova_compute_init[192803]: INFO:nova_statedir:Applying nova statedir ownership
Oct  7 16:03:11 np0005474864 nova_compute_init[192803]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Oct  7 16:03:11 np0005474864 nova_compute_init[192803]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Oct  7 16:03:11 np0005474864 nova_compute_init[192803]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Oct  7 16:03:11 np0005474864 nova_compute_init[192803]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Oct  7 16:03:11 np0005474864 nova_compute_init[192803]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Oct  7 16:03:11 np0005474864 nova_compute_init[192803]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Oct  7 16:03:11 np0005474864 nova_compute_init[192803]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Oct  7 16:03:11 np0005474864 nova_compute_init[192803]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Oct  7 16:03:11 np0005474864 nova_compute_init[192803]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Oct  7 16:03:11 np0005474864 nova_compute_init[192803]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Oct  7 16:03:11 np0005474864 nova_compute_init[192803]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Oct  7 16:03:11 np0005474864 nova_compute_init[192803]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Oct  7 16:03:11 np0005474864 nova_compute_init[192803]: INFO:nova_statedir:Nova statedir ownership complete
Oct  7 16:03:11 np0005474864 systemd[1]: libpod-902c3d15cf8da79783d1c46d2ea9a08f5e156e92a14414eb6f989af3ff5557ef.scope: Deactivated successfully.
Oct  7 16:03:11 np0005474864 podman[192804]: 2025-10-07 20:03:11.367443414 +0000 UTC m=+0.044837645 container died 902c3d15cf8da79783d1c46d2ea9a08f5e156e92a14414eb6f989af3ff5557ef (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=edpm)
Oct  7 16:03:11 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-902c3d15cf8da79783d1c46d2ea9a08f5e156e92a14414eb6f989af3ff5557ef-userdata-shm.mount: Deactivated successfully.
Oct  7 16:03:11 np0005474864 systemd[1]: var-lib-containers-storage-overlay-c5f0a0e5b454c3b258c96a2c069858ccf7c0c82c7dcc2affd6f0c301a0d75c38-merged.mount: Deactivated successfully.
Oct  7 16:03:11 np0005474864 podman[192816]: 2025-10-07 20:03:11.437205249 +0000 UTC m=+0.065190989 container cleanup 902c3d15cf8da79783d1c46d2ea9a08f5e156e92a14414eb6f989af3ff5557ef (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, tcib_managed=true, container_name=nova_compute_init, org.label-schema.vendor=CentOS, config_id=edpm)
Oct  7 16:03:11 np0005474864 systemd[1]: libpod-conmon-902c3d15cf8da79783d1c46d2ea9a08f5e156e92a14414eb6f989af3ff5557ef.scope: Deactivated successfully.
Oct  7 16:03:11 np0005474864 nova_compute[192593]: 2025-10-07 20:03:11.890 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  7 16:03:11 np0005474864 nova_compute[192593]: 2025-10-07 20:03:11.890 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  7 16:03:11 np0005474864 nova_compute[192593]: 2025-10-07 20:03:11.891 2 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Oct  7 16:03:11 np0005474864 nova_compute[192593]: 2025-10-07 20:03:11.891 2 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.047 2 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.081 2 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:03:12 np0005474864 systemd[1]: session-26.scope: Deactivated successfully.
Oct  7 16:03:12 np0005474864 systemd[1]: session-26.scope: Consumed 2min 42.505s CPU time.
Oct  7 16:03:12 np0005474864 systemd-logind[805]: Session 26 logged out. Waiting for processes to exit.
Oct  7 16:03:12 np0005474864 systemd-logind[805]: Removed session 26.
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.559 2 INFO nova.virt.driver [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.687 2 INFO nova.compute.provider_config [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.704 2 DEBUG oslo_concurrency.lockutils [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.705 2 DEBUG oslo_concurrency.lockutils [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.705 2 DEBUG oslo_concurrency.lockutils [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.705 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.705 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.706 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.706 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.706 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.706 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.706 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.706 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.706 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.707 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.707 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.707 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.707 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.707 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.707 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.707 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.708 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.708 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.708 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.708 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.708 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.708 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.708 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.709 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.709 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.709 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.709 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.709 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.709 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.710 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.710 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.710 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.710 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.710 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.710 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.710 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.711 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.711 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.711 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.711 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.711 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.711 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.711 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.712 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.712 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.712 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.712 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.712 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.712 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.713 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.713 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.713 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.713 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.713 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.713 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.713 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.714 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.714 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.714 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.714 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.714 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.714 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.714 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.714 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.715 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.715 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.715 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.715 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.715 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.715 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.715 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.716 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.716 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.716 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.716 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.716 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.716 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.716 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.717 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.717 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.717 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.717 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.717 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.717 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.717 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.718 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.718 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.718 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.718 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.718 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.718 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.718 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.719 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.719 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.719 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.719 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.719 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.719 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.719 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.719 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.720 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.720 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.720 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.720 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.720 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.720 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.720 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.721 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.721 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.721 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.721 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.721 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.721 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.721 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.722 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.722 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.722 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.722 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.722 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.722 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.722 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.723 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.723 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.723 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.723 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.723 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.723 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.723 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.724 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.724 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.724 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.724 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.724 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.724 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.724 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.725 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.725 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.725 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.725 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.725 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.725 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.725 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.725 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.726 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.726 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.726 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.726 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.726 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.726 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.727 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.727 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.727 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.727 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.727 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.727 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.728 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.728 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.728 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.728 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.728 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.728 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.728 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.729 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.729 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.729 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.729 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.729 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.729 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.729 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.730 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.730 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.730 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.730 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.730 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.730 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.731 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.731 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.731 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.731 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.731 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.731 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.731 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.732 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.732 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.732 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.732 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.732 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.732 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.732 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.733 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.733 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.733 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.733 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.733 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.733 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.733 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.734 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.734 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.734 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.734 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.734 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.734 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.734 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.734 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.735 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.735 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.735 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.735 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.735 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.735 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.735 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.736 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.736 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.736 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.736 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.736 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.736 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.737 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.737 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.737 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.737 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.737 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.737 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.737 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.738 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.738 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.738 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.738 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.738 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.738 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.738 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.739 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.739 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.739 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.739 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.739 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.739 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.739 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.740 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.740 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.740 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.740 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.740 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.740 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.741 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.741 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.741 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.741 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.741 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.741 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.741 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.741 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.742 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.742 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.742 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.742 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.742 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.742 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.742 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.743 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.743 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.743 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.743 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.743 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.743 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.744 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.744 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.744 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.744 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.744 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.744 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.744 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.745 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.745 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.745 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.745 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.745 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.745 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.746 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.746 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.746 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.746 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.746 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.746 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.746 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.747 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.747 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.747 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.747 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.747 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.747 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.747 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.748 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.748 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.748 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.748 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.748 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.748 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.748 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.749 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.749 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.749 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.749 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.749 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.749 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.749 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.750 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.750 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.750 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.750 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.750 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.750 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.750 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.751 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.751 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.751 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.751 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.751 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.751 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.751 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.752 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.752 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.752 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.752 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.752 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.752 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.752 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.753 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.753 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.753 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.753 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.753 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.753 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.754 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.754 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.754 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.754 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.754 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.755 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.755 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.755 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.755 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.755 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.755 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.756 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.756 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.756 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.756 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.756 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.756 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.757 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.757 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.757 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.757 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.757 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.758 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.758 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.758 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.758 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.758 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.758 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.759 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.759 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.759 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.759 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.759 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.759 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.760 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.760 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.761 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.761 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.762 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.762 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.762 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.762 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.762 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.763 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.763 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.763 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.763 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.763 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.763 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.764 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.764 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.764 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.764 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.764 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.765 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.765 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.765 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.765 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.765 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.766 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.766 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.766 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.766 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.766 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.766 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.767 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.767 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.767 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.767 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.767 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.767 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.768 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.768 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.768 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.768 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.768 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.769 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.769 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.769 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.769 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.769 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.769 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.770 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.770 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.770 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.770 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.770 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.771 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.771 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.771 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.771 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.771 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.771 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.772 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.772 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.772 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.772 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.772 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.772 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.773 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.773 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.773 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.773 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.773 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.774 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.774 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.774 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.774 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.774 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.774 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.775 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.775 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.775 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.775 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.775 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.776 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.776 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.776 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.776 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.776 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.777 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.777 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.777 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.777 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.777 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.777 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.778 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.778 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.778 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.778 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.778 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.779 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.779 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.779 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.779 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.779 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.779 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.780 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.780 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.780 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.780 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.780 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.781 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.781 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.781 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.781 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.781 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.781 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.782 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.782 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.782 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.782 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.782 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.783 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.783 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.783 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.783 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.783 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.784 2 WARNING oslo_config.cfg [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Oct  7 16:03:12 np0005474864 nova_compute[192593]: live_migration_uri is deprecated for removal in favor of two other options that
Oct  7 16:03:12 np0005474864 nova_compute[192593]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Oct  7 16:03:12 np0005474864 nova_compute[192593]: and ``live_migration_inbound_addr`` respectively.
Oct  7 16:03:12 np0005474864 nova_compute[192593]: ).  Its value may be silently ignored in the future.#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.784 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.784 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.784 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.784 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.785 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.785 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.785 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.785 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.785 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.786 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.786 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.786 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.786 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.786 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.786 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.787 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.787 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.787 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.787 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.788 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.788 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.788 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.788 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.788 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.789 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.789 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.789 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.789 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.790 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.790 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.790 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.790 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.790 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.791 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.791 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.791 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.791 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.791 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.792 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.792 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.792 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.792 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.792 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.792 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.792 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.793 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.793 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.793 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.793 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.793 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.793 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.793 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.793 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.794 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.794 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.794 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.794 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.794 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.794 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.794 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.795 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.795 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.795 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.795 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.795 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.795 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.795 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.795 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.796 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.796 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.796 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.796 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.796 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.796 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.796 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.797 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.797 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.797 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.797 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.797 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.797 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.797 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.797 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.798 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.798 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.798 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.798 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.798 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.798 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.798 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.799 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.799 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.799 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.799 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.799 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.799 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.799 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.799 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.800 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.800 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.800 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.800 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.800 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.800 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.800 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.800 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.801 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.801 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.801 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.801 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.801 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.801 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.801 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.802 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.802 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.802 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.802 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.802 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.802 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.802 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.802 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.803 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.803 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.803 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.803 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.803 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.803 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.803 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.803 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.804 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.804 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.804 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.804 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.804 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.804 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.804 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.805 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.805 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.805 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.805 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.805 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.805 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.805 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.806 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.806 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.806 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.806 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.806 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.806 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.806 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.807 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.807 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.807 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.807 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.807 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.807 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.807 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.807 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.808 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.808 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.808 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.808 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.808 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.808 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.808 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.809 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.809 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.809 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.809 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.809 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.809 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.809 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.810 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.810 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.810 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.810 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.810 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.810 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.810 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.811 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.811 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.811 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.811 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.811 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.811 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.811 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.812 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.812 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.812 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.812 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.812 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.812 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.812 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.812 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.813 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.813 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.813 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.813 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.813 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.813 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.814 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.814 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.814 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.814 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.814 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.814 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.814 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.814 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.815 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.815 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.815 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.815 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.815 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.815 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.815 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.816 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.816 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.816 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.816 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.816 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.816 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.816 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.817 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.817 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.817 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.817 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.817 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.817 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.817 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.817 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.818 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.818 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.818 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.818 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.818 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.818 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.818 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.819 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.819 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.819 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.819 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.819 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.819 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.819 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.820 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.820 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.820 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.820 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.820 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.820 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.821 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.821 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.821 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.821 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.821 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.821 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.821 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.822 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.822 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.822 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.822 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.822 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.822 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.822 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.823 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.823 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.823 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.823 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.823 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.823 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.823 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.823 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.824 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.824 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.824 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.824 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.824 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.824 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.824 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.825 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.825 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.825 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.825 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.825 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.825 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.825 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.826 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.826 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.826 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.826 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.826 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.826 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.826 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.827 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.827 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.827 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.827 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.827 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.827 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.827 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.827 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.828 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.828 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.828 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.828 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.828 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.828 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.828 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.829 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.829 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.829 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.829 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.829 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.829 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.829 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.830 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.830 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.830 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.830 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.830 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.830 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.830 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.830 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.831 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.831 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.831 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.831 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.831 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.831 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.831 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.832 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.832 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.832 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.832 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.832 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.832 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.832 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.832 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.833 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.833 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.833 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.833 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.833 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.833 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.833 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.834 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.834 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.834 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.834 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.834 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.834 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.834 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.835 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.835 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.835 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.835 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.835 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.835 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.835 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.835 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.836 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.836 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.836 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.836 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.836 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.836 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.836 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.837 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.837 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.837 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.837 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.837 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.837 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.837 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.837 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.838 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.838 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.838 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.838 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.838 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.838 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.838 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.839 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.839 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.839 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.839 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.839 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.839 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.839 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.839 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.840 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.840 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.840 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.840 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.840 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.840 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.840 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.841 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.841 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.841 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.841 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.841 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.841 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.841 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.841 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.842 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.842 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.842 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.842 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.842 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.842 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.842 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.843 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.843 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.843 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.843 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.843 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.843 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.843 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.843 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.844 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.844 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.844 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.844 2 DEBUG oslo_service.service [None req-f8c422f5-b685-4822-9a61-855ad7fba7a0 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.845 2 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.860 2 INFO nova.virt.node [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Determined node identity 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 from /var/lib/nova/compute_id#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.861 2 DEBUG nova.virt.libvirt.host [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.862 2 DEBUG nova.virt.libvirt.host [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.862 2 DEBUG nova.virt.libvirt.host [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.862 2 DEBUG nova.virt.libvirt.host [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.875 2 DEBUG nova.virt.libvirt.host [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f7bd2d2fa30> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.879 2 DEBUG nova.virt.libvirt.host [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f7bd2d2fa30> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.880 2 INFO nova.virt.libvirt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Connection event '1' reason 'None'#033[00m
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.891 2 INFO nova.virt.libvirt.host [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Libvirt host capabilities <capabilities>
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <host>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <uuid>2e9c4e5f-0506-4565-be4b-95bb9b08ebdc</uuid>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <cpu>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <arch>x86_64</arch>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model>EPYC-Rome-v4</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <vendor>AMD</vendor>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <microcode version='16777317'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <signature family='23' model='49' stepping='0'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <maxphysaddr mode='emulate' bits='40'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='x2apic'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='tsc-deadline'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='osxsave'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='hypervisor'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='tsc_adjust'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='spec-ctrl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='stibp'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='arch-capabilities'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='ssbd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='cmp_legacy'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='topoext'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='virt-ssbd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='lbrv'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='tsc-scale'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='vmcb-clean'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='pause-filter'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='pfthreshold'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='svme-addr-chk'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='rdctl-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='skip-l1dfl-vmentry'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='mds-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature name='pschange-mc-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <pages unit='KiB' size='4'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <pages unit='KiB' size='2048'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <pages unit='KiB' size='1048576'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </cpu>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <power_management>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <suspend_mem/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <suspend_disk/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <suspend_hybrid/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </power_management>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <iommu support='no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <migration_features>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <live/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <uri_transports>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <uri_transport>tcp</uri_transport>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <uri_transport>rdma</uri_transport>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </uri_transports>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </migration_features>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <topology>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <cells num='1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <cell id='0'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:          <memory unit='KiB'>7864096</memory>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:          <pages unit='KiB' size='4'>1966024</pages>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:          <pages unit='KiB' size='2048'>0</pages>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:          <pages unit='KiB' size='1048576'>0</pages>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:          <distances>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:            <sibling id='0' value='10'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:          </distances>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:          <cpus num='8'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:          </cpus>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        </cell>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </cells>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </topology>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <cache>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </cache>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <secmodel>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model>selinux</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <doi>0</doi>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </secmodel>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <secmodel>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model>dac</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <doi>0</doi>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <baselabel type='kvm'>+107:+107</baselabel>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <baselabel type='qemu'>+107:+107</baselabel>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </secmodel>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  </host>
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <guest>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <os_type>hvm</os_type>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <arch name='i686'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <wordsize>32</wordsize>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <domain type='qemu'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <domain type='kvm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </arch>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <features>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <pae/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <nonpae/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <acpi default='on' toggle='yes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <apic default='on' toggle='no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <cpuselection/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <deviceboot/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <disksnapshot default='on' toggle='no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <externalSnapshot/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </features>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  </guest>
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <guest>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <os_type>hvm</os_type>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <arch name='x86_64'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <wordsize>64</wordsize>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine canonical='pc-q35-rhel9.6.0' maxCpus='4096'>q35</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <domain type='qemu'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <domain type='kvm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </arch>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <features>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <acpi default='on' toggle='yes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <apic default='on' toggle='no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <cpuselection/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <deviceboot/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <disksnapshot default='on' toggle='no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <externalSnapshot/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </features>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  </guest>
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 
Oct  7 16:03:12 np0005474864 nova_compute[192593]: </capabilities>
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.894 2 DEBUG nova.virt.libvirt.volume.mount [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.900 2 DEBUG nova.virt.libvirt.host [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.904 2 DEBUG nova.virt.libvirt.host [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Oct  7 16:03:12 np0005474864 nova_compute[192593]: <domainCapabilities>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <path>/usr/libexec/qemu-kvm</path>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <domain>kvm</domain>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <arch>i686</arch>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <vcpu max='4096'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <iothreads supported='yes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <os supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <enum name='firmware'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <loader supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='type'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>rom</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>pflash</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='readonly'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>yes</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>no</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='secure'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>no</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </loader>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <cpu>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <mode name='host-passthrough' supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='hostPassthroughMigratable'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>on</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>off</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </mode>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <mode name='maximum' supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='maximumMigratable'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>on</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>off</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </mode>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <mode name='host-model' supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <vendor>AMD</vendor>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='x2apic'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='tsc-deadline'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='hypervisor'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='tsc_adjust'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='spec-ctrl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='stibp'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='arch-capabilities'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='ssbd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='cmp_legacy'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='overflow-recov'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='succor'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='ibrs'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='amd-ssbd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='virt-ssbd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='lbrv'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='tsc-scale'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='vmcb-clean'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='flushbyasid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='pause-filter'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='pfthreshold'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='svme-addr-chk'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='rdctl-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='mds-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='pschange-mc-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='gds-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='rfds-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='disable' name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </mode>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <mode name='custom' supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Broadwell'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-IBRS'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-noTSX'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-v3'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-v4'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v3'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v4'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v5'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cooperlake'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cooperlake-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cooperlake-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Denverton'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='mpx'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Denverton-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='mpx'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Denverton-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Denverton-v3'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Dhyana-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Genoa'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amd-psfd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='auto-ibrs'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='stibp-always-on'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Genoa-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amd-psfd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='auto-ibrs'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='stibp-always-on'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Milan'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Milan-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Milan-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amd-psfd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='stibp-always-on'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Rome'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Rome-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Rome-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Rome-v3'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='EPYC-v3'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='EPYC-v4'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='GraniteRapids'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-fp16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='prefetchiti'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='GraniteRapids-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-fp16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='prefetchiti'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='GraniteRapids-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-fp16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx10'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx10-128'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx10-256'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx10-512'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='prefetchiti'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Haswell'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Haswell-IBRS'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Haswell-noTSX'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Haswell-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Haswell-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Haswell-v3'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Haswell-v4'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-noTSX'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v3'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v4'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v5'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v6'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v7'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='IvyBridge'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='IvyBridge-IBRS'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='IvyBridge-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='IvyBridge-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='KnightsMill'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-4fmaps'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-4vnniw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512er'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512pf'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='KnightsMill-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-4fmaps'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-4vnniw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512er'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512pf'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Opteron_G4'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fma4'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xop'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Opteron_G4-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fma4'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xop'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Opteron_G5'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fma4'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='tbm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xop'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Opteron_G5-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fma4'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='tbm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xop'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='SapphireRapids'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='SapphireRapids-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='SapphireRapids-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='SapphireRapids-v3'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='SierraForest'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx-ifma'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx-ne-convert'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx-vnni-int8'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='cmpccxadd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='SierraForest-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx-ifma'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx-ne-convert'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx-vnni-int8'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='cmpccxadd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-IBRS'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-v3'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-v4'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-IBRS'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v3'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v4'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v5'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Snowridge'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='core-capability'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='mpx'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='split-lock-detect'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Snowridge-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='core-capability'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='mpx'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='split-lock-detect'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Snowridge-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='core-capability'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='split-lock-detect'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Snowridge-v3'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='core-capability'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='split-lock-detect'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Snowridge-v4'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='athlon'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='3dnow'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='3dnowext'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='athlon-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='3dnow'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='3dnowext'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='core2duo'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='core2duo-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='coreduo'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='coreduo-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='n270'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='n270-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='phenom'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='3dnow'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='3dnowext'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='phenom-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='3dnow'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='3dnowext'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </mode>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <memoryBacking supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <enum name='sourceType'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <value>file</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <value>anonymous</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <value>memfd</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  </memoryBacking>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <disk supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='diskDevice'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>disk</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>cdrom</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>floppy</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>lun</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='bus'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>fdc</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>scsi</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>virtio</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>usb</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>sata</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='model'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>virtio</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>virtio-transitional</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>virtio-non-transitional</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <graphics supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='type'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>vnc</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>egl-headless</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>dbus</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </graphics>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <video supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='modelType'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>vga</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>cirrus</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>virtio</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>none</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>bochs</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>ramfb</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <hostdev supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='mode'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>subsystem</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='startupPolicy'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>default</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>mandatory</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>requisite</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>optional</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='subsysType'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>usb</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>pci</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>scsi</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='capsType'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='pciBackend'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </hostdev>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <rng supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='model'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>virtio</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>virtio-transitional</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>virtio-non-transitional</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='backendModel'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>random</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>egd</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>builtin</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <filesystem supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='driverType'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>path</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>handle</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>virtiofs</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </filesystem>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <tpm supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='model'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>tpm-tis</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>tpm-crb</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='backendModel'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>emulator</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>external</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='backendVersion'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>2.0</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </tpm>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <redirdev supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='bus'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>usb</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </redirdev>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <channel supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='type'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>pty</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>unix</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </channel>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <crypto supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='model'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='type'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>qemu</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='backendModel'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>builtin</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </crypto>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <interface supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='backendType'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>default</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>passt</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <panic supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='model'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>isa</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>hyperv</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </panic>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <gic supported='no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <vmcoreinfo supported='yes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <genid supported='yes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <backingStoreInput supported='yes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <backup supported='yes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <async-teardown supported='yes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <ps2 supported='yes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <sev supported='no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <sgx supported='no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <hyperv supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='features'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>relaxed</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>vapic</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>spinlocks</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>vpindex</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>runtime</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>synic</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>stimer</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>reset</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>vendor_id</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>frequencies</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>reenlightenment</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>tlbflush</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>ipi</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>avic</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>emsr_bitmap</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>xmm_input</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </hyperv>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <launchSecurity supported='no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:03:12 np0005474864 nova_compute[192593]: </domainCapabilities>
Oct  7 16:03:12 np0005474864 nova_compute[192593]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
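[Editor's note: the `domainCapabilities` XML above pairs each `usable='no'` CPU model with a `<blockers>` element naming the host features that prevent it. A minimal sketch of extracting that pairing with Python's standard-library ElementTree follows; the `CAPS_XML` excerpt and `unusable_models` helper are illustrative inventions modeled on the logged output, not Nova's actual parsing code.]

```python
import xml.etree.ElementTree as ET

# Hypothetical minimal excerpt of the <mode name='custom'> section
# seen in the log above (phenom-v1 blocked by 3dnow/3dnowext).
CAPS_XML = """
<cpu>
  <mode name='custom' supported='yes'>
    <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
    <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
    <blockers model='phenom-v1'>
      <feature name='3dnow'/>
      <feature name='3dnowext'/>
    </blockers>
  </mode>
</cpu>
"""

def unusable_models(caps_xml: str) -> dict:
    """Map each usable='no' model name to the feature names blocking it."""
    root = ET.fromstring(caps_xml)
    mode = root.find(".//mode[@name='custom']")
    # Start with every unusable model, then attach its blocker features.
    blocked = {m.text: [] for m in mode.findall("model") if m.get("usable") == "no"}
    for b in mode.findall("blockers"):
        blocked[b.get("model")] = [f.get("name") for f in b.findall("feature")]
    return blocked

print(unusable_models(CAPS_XML))  # {'phenom-v1': ['3dnow', '3dnowext']}
```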
Oct  7 16:03:12 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.911 2 DEBUG nova.virt.libvirt.host [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Oct  7 16:03:12 np0005474864 nova_compute[192593]: <domainCapabilities>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <path>/usr/libexec/qemu-kvm</path>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <domain>kvm</domain>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <arch>i686</arch>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <vcpu max='240'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <iothreads supported='yes'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <os supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <enum name='firmware'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <loader supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='type'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>rom</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>pflash</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='readonly'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>yes</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>no</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='secure'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>no</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </loader>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:  <cpu>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <mode name='host-passthrough' supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='hostPassthroughMigratable'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>on</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>off</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </mode>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <mode name='maximum' supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <enum name='maximumMigratable'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>on</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <value>off</value>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </mode>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <mode name='host-model' supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <vendor>AMD</vendor>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='x2apic'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='tsc-deadline'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='hypervisor'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='tsc_adjust'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='spec-ctrl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='stibp'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='arch-capabilities'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='ssbd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='cmp_legacy'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='overflow-recov'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='succor'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='ibrs'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='amd-ssbd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='virt-ssbd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='lbrv'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='tsc-scale'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='vmcb-clean'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='flushbyasid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='pause-filter'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='pfthreshold'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='svme-addr-chk'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='rdctl-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='mds-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='pschange-mc-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='gds-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='require' name='rfds-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <feature policy='disable' name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    </mode>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:    <mode name='custom' supported='yes'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Broadwell'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-IBRS'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-noTSX'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-v3'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-v4'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v3'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v4'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v5'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cooperlake'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cooperlake-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Cooperlake-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Denverton'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='mpx'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Denverton-v1'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='mpx'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Denverton-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Denverton-v3'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='Dhyana-v2'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Genoa'>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='amd-psfd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='auto-ibrs'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:12 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='stibp-always-on'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Genoa-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amd-psfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='auto-ibrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='stibp-always-on'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Milan'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Milan-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Milan-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amd-psfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='stibp-always-on'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Rome'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Rome-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Rome-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Rome-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='GraniteRapids'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='prefetchiti'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='GraniteRapids-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='prefetchiti'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='GraniteRapids-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx10'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx10-128'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx10-256'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx10-512'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='prefetchiti'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-noTSX'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-noTSX'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v5'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v6'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v7'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='IvyBridge'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='IvyBridge-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='IvyBridge-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='IvyBridge-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='KnightsMill'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-4fmaps'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-4vnniw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512er'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512pf'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='KnightsMill-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-4fmaps'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-4vnniw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512er'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512pf'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Opteron_G4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fma4'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xop'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Opteron_G4-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fma4'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xop'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Opteron_G5'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fma4'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tbm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xop'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Opteron_G5-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fma4'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tbm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xop'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='SapphireRapids'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='SapphireRapids-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='SapphireRapids-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='SapphireRapids-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='SierraForest'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-ne-convert'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cmpccxadd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='SierraForest-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-ne-convert'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cmpccxadd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v5'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Snowridge'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='core-capability'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mpx'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='split-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Snowridge-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='core-capability'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mpx'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='split-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Snowridge-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='core-capability'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='split-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Snowridge-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='core-capability'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='split-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Snowridge-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='athlon'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnow'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnowext'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='athlon-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnow'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnowext'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='core2duo'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='core2duo-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='coreduo'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='coreduo-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='n270'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='n270-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='phenom'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnow'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnowext'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='phenom-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnow'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnowext'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </mode>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <memoryBacking supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <enum name='sourceType'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <value>file</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <value>anonymous</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <value>memfd</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  </memoryBacking>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <disk supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='diskDevice'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>disk</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>cdrom</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>floppy</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>lun</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='bus'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>ide</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>fdc</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>scsi</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>usb</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>sata</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='model'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio-transitional</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio-non-transitional</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <graphics supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='type'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>vnc</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>egl-headless</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>dbus</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </graphics>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <video supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='modelType'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>vga</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>cirrus</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>none</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>bochs</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>ramfb</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <hostdev supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='mode'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>subsystem</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='startupPolicy'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>default</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>mandatory</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>requisite</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>optional</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='subsysType'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>usb</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>pci</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>scsi</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='capsType'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='pciBackend'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </hostdev>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <rng supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='model'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio-transitional</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio-non-transitional</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='backendModel'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>random</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>egd</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>builtin</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <filesystem supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='driverType'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>path</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>handle</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtiofs</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </filesystem>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <tpm supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='model'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>tpm-tis</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>tpm-crb</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='backendModel'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>emulator</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>external</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='backendVersion'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>2.0</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </tpm>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <redirdev supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='bus'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>usb</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </redirdev>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <channel supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='type'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>pty</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>unix</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </channel>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <crypto supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='model'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='type'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>qemu</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='backendModel'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>builtin</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </crypto>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <interface supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='backendType'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>default</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>passt</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <panic supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='model'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>isa</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>hyperv</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </panic>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <gic supported='no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <vmcoreinfo supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <genid supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <backingStoreInput supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <backup supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <async-teardown supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <ps2 supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <sev supported='no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <sgx supported='no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <hyperv supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='features'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>relaxed</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>vapic</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>spinlocks</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>vpindex</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>runtime</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>synic</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>stimer</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>reset</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>vendor_id</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>frequencies</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>reenlightenment</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>tlbflush</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>ipi</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>avic</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>emsr_bitmap</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>xmm_input</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </hyperv>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <launchSecurity supported='no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:03:13 np0005474864 nova_compute[192593]: </domainCapabilities>
Oct  7 16:03:13 np0005474864 nova_compute[192593]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.952 2 DEBUG nova.virt.libvirt.host [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:12.956 2 DEBUG nova.virt.libvirt.host [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Oct  7 16:03:13 np0005474864 nova_compute[192593]: <domainCapabilities>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <path>/usr/libexec/qemu-kvm</path>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <domain>kvm</domain>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <machine>pc-q35-rhel9.6.0</machine>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <arch>x86_64</arch>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <vcpu max='4096'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <iothreads supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <os supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <enum name='firmware'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <value>efi</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <loader supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='type'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>rom</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>pflash</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='readonly'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>yes</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>no</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='secure'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>yes</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>no</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </loader>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <cpu>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <mode name='host-passthrough' supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='hostPassthroughMigratable'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>on</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>off</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </mode>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <mode name='maximum' supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='maximumMigratable'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>on</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>off</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </mode>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <mode name='host-model' supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <vendor>AMD</vendor>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='x2apic'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='tsc-deadline'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='hypervisor'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='tsc_adjust'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='spec-ctrl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='stibp'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='arch-capabilities'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='ssbd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='cmp_legacy'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='overflow-recov'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='succor'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='ibrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='amd-ssbd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='virt-ssbd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='lbrv'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='tsc-scale'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='vmcb-clean'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='flushbyasid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='pause-filter'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='pfthreshold'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='svme-addr-chk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='rdctl-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='mds-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='pschange-mc-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='gds-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='rfds-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='disable' name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </mode>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <mode name='custom' supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Broadwell'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-noTSX'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v5'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cooperlake'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cooperlake-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cooperlake-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Denverton'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mpx'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Denverton-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mpx'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Denverton-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Denverton-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Dhyana-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Genoa'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amd-psfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='auto-ibrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='stibp-always-on'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Genoa-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amd-psfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='auto-ibrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='stibp-always-on'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Milan'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Milan-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Milan-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amd-psfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='stibp-always-on'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Rome'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Rome-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Rome-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Rome-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='GraniteRapids'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='prefetchiti'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='GraniteRapids-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='prefetchiti'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='GraniteRapids-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx10'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx10-128'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx10-256'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx10-512'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='prefetchiti'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-noTSX'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-noTSX'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v5'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v6'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v7'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='IvyBridge'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='IvyBridge-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='IvyBridge-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='IvyBridge-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='KnightsMill'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-4fmaps'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-4vnniw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512er'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512pf'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='KnightsMill-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-4fmaps'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-4vnniw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512er'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512pf'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Opteron_G4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fma4'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xop'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Opteron_G4-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fma4'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xop'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Opteron_G5'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fma4'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tbm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xop'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Opteron_G5-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fma4'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tbm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xop'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='SapphireRapids'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='SapphireRapids-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='SapphireRapids-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='SapphireRapids-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='SierraForest'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-ne-convert'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cmpccxadd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='SierraForest-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-ne-convert'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cmpccxadd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v5'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Snowridge'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='core-capability'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mpx'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='split-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Snowridge-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='core-capability'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mpx'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='split-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Snowridge-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='core-capability'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='split-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Snowridge-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='core-capability'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='split-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Snowridge-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='athlon'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnow'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnowext'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='athlon-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnow'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnowext'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='core2duo'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='core2duo-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='coreduo'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='coreduo-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='n270'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='n270-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='phenom'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnow'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnowext'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='phenom-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnow'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnowext'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </mode>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <memoryBacking supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <enum name='sourceType'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <value>file</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <value>anonymous</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <value>memfd</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  </memoryBacking>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <disk supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='diskDevice'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>disk</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>cdrom</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>floppy</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>lun</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='bus'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>fdc</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>scsi</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>usb</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>sata</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='model'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio-transitional</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio-non-transitional</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <graphics supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='type'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>vnc</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>egl-headless</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>dbus</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </graphics>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <video supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='modelType'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>vga</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>cirrus</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>none</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>bochs</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>ramfb</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <hostdev supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='mode'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>subsystem</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='startupPolicy'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>default</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>mandatory</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>requisite</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>optional</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='subsysType'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>usb</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>pci</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>scsi</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='capsType'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='pciBackend'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </hostdev>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <rng supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='model'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio-transitional</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio-non-transitional</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='backendModel'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>random</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>egd</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>builtin</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <filesystem supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='driverType'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>path</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>handle</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtiofs</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </filesystem>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <tpm supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='model'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>tpm-tis</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>tpm-crb</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='backendModel'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>emulator</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>external</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='backendVersion'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>2.0</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </tpm>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <redirdev supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='bus'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>usb</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </redirdev>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <channel supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='type'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>pty</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>unix</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </channel>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <crypto supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='model'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='type'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>qemu</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='backendModel'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>builtin</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </crypto>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <interface supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='backendType'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>default</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>passt</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <panic supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='model'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>isa</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>hyperv</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </panic>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <gic supported='no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <vmcoreinfo supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <genid supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <backingStoreInput supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <backup supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <async-teardown supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <ps2 supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <sev supported='no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <sgx supported='no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <hyperv supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='features'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>relaxed</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>vapic</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>spinlocks</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>vpindex</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>runtime</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>synic</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>stimer</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>reset</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>vendor_id</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>frequencies</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>reenlightenment</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>tlbflush</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>ipi</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>avic</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>emsr_bitmap</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>xmm_input</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </hyperv>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <launchSecurity supported='no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:03:13 np0005474864 nova_compute[192593]: </domainCapabilities>
Oct  7 16:03:13 np0005474864 nova_compute[192593]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.025 2 DEBUG nova.virt.libvirt.host [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Oct  7 16:03:13 np0005474864 nova_compute[192593]: <domainCapabilities>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <path>/usr/libexec/qemu-kvm</path>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <domain>kvm</domain>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <machine>pc-i440fx-rhel7.6.0</machine>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <arch>x86_64</arch>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <vcpu max='240'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <iothreads supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <os supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <enum name='firmware'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <loader supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='type'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>rom</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>pflash</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='readonly'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>yes</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>no</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='secure'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>no</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </loader>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <cpu>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <mode name='host-passthrough' supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='hostPassthroughMigratable'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>on</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>off</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </mode>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <mode name='maximum' supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='maximumMigratable'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>on</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>off</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </mode>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <mode name='host-model' supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model fallback='forbid'>EPYC-Rome</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <vendor>AMD</vendor>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <maxphysaddr mode='passthrough' limit='40'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='x2apic'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='tsc-deadline'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='hypervisor'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='tsc_adjust'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='spec-ctrl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='stibp'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='arch-capabilities'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='ssbd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='cmp_legacy'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='overflow-recov'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='succor'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='ibrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='amd-ssbd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='virt-ssbd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='lbrv'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='tsc-scale'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='vmcb-clean'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='flushbyasid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='pause-filter'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='pfthreshold'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='svme-addr-chk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='lfence-always-serializing'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='rdctl-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='skip-l1dfl-vmentry'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='mds-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='pschange-mc-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='gds-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='require' name='rfds-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <feature policy='disable' name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </mode>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <mode name='custom' supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Broadwell'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-noTSX'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-noTSX-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Broadwell-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-noTSX'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cascadelake-Server-v5'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cooperlake'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cooperlake-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Cooperlake-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Denverton'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mpx'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Denverton-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mpx'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Denverton-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Denverton-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Dhyana-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Genoa'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amd-psfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='auto-ibrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='stibp-always-on'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Genoa-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amd-psfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='auto-ibrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='stibp-always-on'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Milan'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Milan-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Milan-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amd-psfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='no-nested-data-bp'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='null-sel-clr-base'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='stibp-always-on'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Rome'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Rome-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Rome-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-Rome-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='EPYC-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='GraniteRapids'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='prefetchiti'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='GraniteRapids-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='prefetchiti'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='GraniteRapids-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx10'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx10-128'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx10-256'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx10-512'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='prefetchiti'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-noTSX'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-noTSX-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Haswell-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-noTSX'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v5'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v6'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Icelake-Server-v7'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='IvyBridge'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='IvyBridge-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='IvyBridge-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='IvyBridge-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='KnightsMill'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-4fmaps'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-4vnniw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512er'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512pf'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='KnightsMill-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-4fmaps'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-4vnniw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512er'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512pf'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Opteron_G4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fma4'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xop'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Opteron_G4-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fma4'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xop'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Opteron_G5'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fma4'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tbm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xop'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Opteron_G5-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fma4'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tbm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xop'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='SapphireRapids'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='SapphireRapids-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='SapphireRapids-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='SapphireRapids-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='amx-tile'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-bf16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-fp16'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512-vpopcntdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bitalg'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vbmi2'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrc'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fzrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='la57'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='taa-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='tsx-ldtrk'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xfd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='SierraForest'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-ne-convert'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cmpccxadd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='SierraForest-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-ifma'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-ne-convert'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx-vnni-int8'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='bus-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cmpccxadd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fbsdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='fsrs'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ibrs-all'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mcdt-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pbrsb-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='psdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='sbdr-ssdp-no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='serialize'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vaes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='vpclmulqdq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Client-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='hle'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='rtm'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Skylake-Server-v5'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512bw'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512cd'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512dq'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512f'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='avx512vl'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='invpcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pcid'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='pku'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Snowridge'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='core-capability'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mpx'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='split-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Snowridge-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='core-capability'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='mpx'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='split-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Snowridge-v2'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='core-capability'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='split-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Snowridge-v3'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='core-capability'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='split-lock-detect'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='Snowridge-v4'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='cldemote'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='erms'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='gfni'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdir64b'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='movdiri'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='xsaves'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='athlon'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnow'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnowext'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='athlon-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnow'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnowext'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='core2duo'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='core2duo-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='coreduo'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='coreduo-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='n270'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='n270-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='ss'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='phenom'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnow'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnowext'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <blockers model='phenom-v1'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnow'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <feature name='3dnowext'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </blockers>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </mode>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <memoryBacking supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <enum name='sourceType'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <value>file</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <value>anonymous</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <value>memfd</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  </memoryBacking>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <disk supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='diskDevice'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>disk</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>cdrom</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>floppy</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>lun</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='bus'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>ide</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>fdc</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>scsi</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>usb</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>sata</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='model'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio-transitional</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio-non-transitional</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <graphics supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='type'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>vnc</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>egl-headless</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>dbus</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </graphics>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <video supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='modelType'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>vga</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>cirrus</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>none</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>bochs</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>ramfb</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <hostdev supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='mode'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>subsystem</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='startupPolicy'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>default</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>mandatory</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>requisite</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>optional</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='subsysType'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>usb</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>pci</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>scsi</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='capsType'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='pciBackend'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </hostdev>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <rng supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='model'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio-transitional</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtio-non-transitional</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='backendModel'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>random</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>egd</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>builtin</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <filesystem supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='driverType'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>path</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>handle</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>virtiofs</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </filesystem>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <tpm supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='model'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>tpm-tis</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>tpm-crb</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='backendModel'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>emulator</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>external</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='backendVersion'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>2.0</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </tpm>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <redirdev supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='bus'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>usb</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </redirdev>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <channel supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='type'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>pty</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>unix</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </channel>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <crypto supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='model'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='type'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>qemu</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='backendModel'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>builtin</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </crypto>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <interface supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='backendType'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>default</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>passt</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <panic supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='model'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>isa</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>hyperv</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </panic>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <gic supported='no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <vmcoreinfo supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <genid supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <backingStoreInput supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <backup supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <async-teardown supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <ps2 supported='yes'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <sev supported='no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <sgx supported='no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <hyperv supported='yes'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      <enum name='features'>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>relaxed</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>vapic</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>spinlocks</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>vpindex</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>runtime</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>synic</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>stimer</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>reset</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>vendor_id</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>frequencies</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>reenlightenment</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>tlbflush</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>ipi</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>avic</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>emsr_bitmap</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:        <value>xmm_input</value>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:      </enum>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    </hyperv>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:    <launchSecurity supported='no'/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:03:13 np0005474864 nova_compute[192593]: </domainCapabilities>
Oct  7 16:03:13 np0005474864 nova_compute[192593]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.096 2 DEBUG nova.virt.libvirt.host [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.096 2 INFO nova.virt.libvirt.host [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Secure Boot support detected#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.099 2 INFO nova.virt.libvirt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.099 2 INFO nova.virt.libvirt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.114 2 DEBUG nova.virt.libvirt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] cpu compare xml: <cpu match="exact">
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <model>Nehalem</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]: </cpu>
Oct  7 16:03:13 np0005474864 nova_compute[192593]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.118 2 DEBUG nova.virt.libvirt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.163 2 INFO nova.virt.node [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Determined node identity 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 from /var/lib/nova/compute_id#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.190 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Verified node 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 matches my host compute-2.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.225 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.304 2 DEBUG oslo_concurrency.lockutils [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.305 2 DEBUG oslo_concurrency.lockutils [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.305 2 DEBUG oslo_concurrency.lockutils [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.306 2 DEBUG nova.compute.resource_tracker [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.515 2 WARNING nova.virt.libvirt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.517 2 DEBUG nova.compute.resource_tracker [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6194MB free_disk=73.66986846923828GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.517 2 DEBUG oslo_concurrency.lockutils [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.517 2 DEBUG oslo_concurrency.lockutils [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.677 2 DEBUG nova.compute.resource_tracker [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.678 2 DEBUG nova.compute.resource_tracker [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.757 2 DEBUG nova.scheduler.client.report [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Refreshing inventories for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.791 2 DEBUG nova.scheduler.client.report [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Updating ProviderTree inventory for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.792 2 DEBUG nova.compute.provider_tree [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Updating inventory in ProviderTree for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.812 2 DEBUG nova.scheduler.client.report [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Refreshing aggregate associations for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.842 2 DEBUG nova.scheduler.client.report [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Refreshing trait associations for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.884 2 DEBUG nova.virt.libvirt.host [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Oct  7 16:03:13 np0005474864 nova_compute[192593]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.885 2 INFO nova.virt.libvirt.host [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] kernel doesn't support AMD SEV#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.886 2 DEBUG nova.compute.provider_tree [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.887 2 DEBUG nova.virt.libvirt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.890 2 DEBUG nova.virt.libvirt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Libvirt baseline CPU <cpu>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <arch>x86_64</arch>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <model>Nehalem</model>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <vendor>AMD</vendor>
Oct  7 16:03:13 np0005474864 nova_compute[192593]:  <topology sockets="8" cores="1" threads="1"/>
Oct  7 16:03:13 np0005474864 nova_compute[192593]: </cpu>
Oct  7 16:03:13 np0005474864 nova_compute[192593]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.926 2 DEBUG nova.scheduler.client.report [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.962 2 DEBUG nova.compute.resource_tracker [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.962 2 DEBUG oslo_concurrency.lockutils [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.445s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:03:13 np0005474864 nova_compute[192593]: 2025-10-07 20:03:13.963 2 DEBUG nova.service [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Oct  7 16:03:14 np0005474864 nova_compute[192593]: 2025-10-07 20:03:14.011 2 DEBUG nova.service [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Oct  7 16:03:14 np0005474864 nova_compute[192593]: 2025-10-07 20:03:14.012 2 DEBUG nova.servicegroup.drivers.db [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] DB_Driver: join new ServiceGroup member compute-2.ctlplane.example.com to the compute group, service = <Service: host=compute-2.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Oct  7 16:03:16 np0005474864 nova_compute[192593]: 2025-10-07 20:03:16.013 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:03:16 np0005474864 nova_compute[192593]: 2025-10-07 20:03:16.051 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:03:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:03:16.172 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:03:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:03:16.173 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:03:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:03:16.173 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:03:17 np0005474864 podman[192895]: 2025-10-07 20:03:17.420651589 +0000 UTC m=+0.108238791 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  7 16:03:17 np0005474864 podman[192894]: 2025-10-07 20:03:17.446000954 +0000 UTC m=+0.130896688 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  7 16:03:17 np0005474864 systemd-logind[805]: New session 29 of user zuul.
Oct  7 16:03:17 np0005474864 systemd[1]: Started Session 29 of User zuul.
Oct  7 16:03:18 np0005474864 podman[193061]: 2025-10-07 20:03:18.586699365 +0000 UTC m=+0.131784345 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Oct  7 16:03:18 np0005474864 python3.9[193100]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Oct  7 16:03:20 np0005474864 python3.9[193269]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  7 16:03:20 np0005474864 systemd[1]: Reloading.
Oct  7 16:03:20 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:03:20 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:03:21 np0005474864 python3.9[193453]: ansible-ansible.builtin.service_facts Invoked
Oct  7 16:03:21 np0005474864 network[193470]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Oct  7 16:03:21 np0005474864 network[193471]: 'network-scripts' will be removed from distribution in near future.
Oct  7 16:03:21 np0005474864 network[193472]: It is advised to switch to 'NetworkManager' instead for network management.
Oct  7 16:03:23 np0005474864 podman[193478]: 2025-10-07 20:03:23.403739323 +0000 UTC m=+0.092538786 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  7 16:03:28 np0005474864 python3.9[193768]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 16:03:30 np0005474864 python3.9[193921]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:03:30 np0005474864 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  7 16:03:31 np0005474864 python3.9[194074]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:03:32 np0005474864 python3.9[194226]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 16:03:33 np0005474864 python3.9[194378]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  7 16:03:34 np0005474864 python3.9[194530]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  7 16:03:34 np0005474864 systemd[1]: Reloading.
Oct  7 16:03:34 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:03:34 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:03:35 np0005474864 python3.9[194717]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 16:03:36 np0005474864 python3.9[194870]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:03:37 np0005474864 python3.9[195020]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:03:38 np0005474864 python3.9[195172]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:03:39 np0005474864 python3.9[195293]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759867418.0720062-361-108563083422853/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=d3d36c542f4af449a66988015465dd0bb4b47bb9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:03:40 np0005474864 python3.9[195445]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Oct  7 16:03:41 np0005474864 python3.9[195597]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Oct  7 16:03:42 np0005474864 python3.9[195750]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Oct  7 16:03:43 np0005474864 auditd[707]: Audit daemon rotating log files
Oct  7 16:03:43 np0005474864 python3.9[195908]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Oct  7 16:03:45 np0005474864 python3.9[196066]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:03:46 np0005474864 python3.9[196187]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759867424.8482268-566-29734992298410/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:03:46 np0005474864 python3.9[196337]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:03:47 np0005474864 python3.9[196458]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759867426.2980413-566-132558657488971/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:03:47 np0005474864 podman[196459]: 2025-10-07 20:03:47.655154063 +0000 UTC m=+0.095114691 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  7 16:03:47 np0005474864 podman[196460]: 2025-10-07 20:03:47.661680432 +0000 UTC m=+0.096146411 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:03:48 np0005474864 python3.9[196644]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:03:48 np0005474864 podman[196739]: 2025-10-07 20:03:48.951544221 +0000 UTC m=+0.144520275 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, container_name=ovn_controller, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  7 16:03:49 np0005474864 python3.9[196778]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1759867427.7655187-566-191789030115595/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:03:49 np0005474864 python3.9[196942]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:03:50 np0005474864 python3.9[197094]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:03:51 np0005474864 python3.9[197246]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:03:52 np0005474864 python3.9[197367]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867431.1455765-743-14502884799972/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:03:53 np0005474864 python3.9[197517]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:03:53 np0005474864 podman[197567]: 2025-10-07 20:03:53.515025611 +0000 UTC m=+0.061267639 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent)
Oct  7 16:03:53 np0005474864 python3.9[197610]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:03:54 np0005474864 python3.9[197763]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:03:55 np0005474864 python3.9[197884]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867433.927854-743-97626501447781/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:03:56 np0005474864 python3.9[198034]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:03:56 np0005474864 python3.9[198155]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867435.5676205-743-145954728865916/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:03:57 np0005474864 python3.9[198305]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:03:58 np0005474864 python3.9[198426]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867437.1377828-743-80995897692900/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:03:59 np0005474864 python3.9[198576]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:04:00 np0005474864 python3.9[198697]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867438.8391979-743-6824906817163/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:04:00 np0005474864 python3.9[198847]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:04:01 np0005474864 python3.9[198968]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867440.2532609-743-210179258936151/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:04:02 np0005474864 python3.9[199118]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:04:03 np0005474864 python3.9[199239]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867441.7423875-743-266974296338960/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:04:03 np0005474864 python3.9[199389]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:04:04 np0005474864 python3.9[199510]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867443.2167637-743-7748479559310/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:04:05 np0005474864 python3.9[199660]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:04:05 np0005474864 python3.9[199781]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867444.6525242-743-63024796347389/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:04:06 np0005474864 python3.9[199931]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:04:07 np0005474864 python3.9[200052]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867446.0695317-743-16888777127939/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:04:08 np0005474864 python3.9[200202]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:04:08 np0005474864 python3.9[200278]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:04:09 np0005474864 python3.9[200428]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:04:09 np0005474864 python3.9[200504]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:04:10 np0005474864 python3.9[200654]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:04:11 np0005474864 python3.9[200730]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.094 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.095 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.095 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.095 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.108 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.108 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.108 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.109 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.109 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.109 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.109 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.109 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.109 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.127 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.128 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.128 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.128 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:04:12 np0005474864 python3.9[200882]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.293 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.294 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6170MB free_disk=73.66828536987305GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.294 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.294 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.383 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.384 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.406 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.432 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.434 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:04:12 np0005474864 nova_compute[192593]: 2025-10-07 20:04:12.434 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:04:13 np0005474864 python3.9[201034]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:04:14 np0005474864 python3.9[201186]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:04:15 np0005474864 python3.9[201338]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 16:04:15 np0005474864 systemd[1]: Reloading.
Oct  7 16:04:15 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:04:15 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:04:15 np0005474864 systemd[1]: Listening on Podman API Socket.
Oct  7 16:04:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:04:16.173 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:04:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:04:16.173 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:04:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:04:16.173 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:04:16 np0005474864 python3.9[201529]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:04:17 np0005474864 python3.9[201652]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759867455.8365505-1409-204517060830268/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:04:17 np0005474864 python3.9[201728]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:04:18 np0005474864 podman[201823]: 2025-10-07 20:04:18.047046903 +0000 UTC m=+0.058916621 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:04:18 np0005474864 podman[201824]: 2025-10-07 20:04:18.051908644 +0000 UTC m=+0.064741440 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3)
Oct  7 16:04:18 np0005474864 python3.9[201888]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759867455.8365505-1409-204517060830268/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:04:19 np0005474864 podman[202015]: 2025-10-07 20:04:19.332215155 +0000 UTC m=+0.085054549 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:04:19 np0005474864 python3.9[202066]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Oct  7 16:04:20 np0005474864 python3.9[202221]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  7 16:04:21 np0005474864 python3[202373]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Oct  7 16:04:22 np0005474864 podman[202410]: 2025-10-07 20:04:22.098668969 +0000 UTC m=+0.116316376 container create b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, 
tcib_managed=true)
Oct  7 16:04:22 np0005474864 podman[202410]: 2025-10-07 20:04:22.006413692 +0000 UTC m=+0.024061119 image pull 5397cd841d80292a5786d82cb8a2bcd574988efb08c605ba6eaaa59d6f646815 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Oct  7 16:04:22 np0005474864 python3[202373]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume 
/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Oct  7 16:04:23 np0005474864 python3.9[202600]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:04:23 np0005474864 podman[202726]: 2025-10-07 20:04:23.923487682 +0000 UTC m=+0.087844691 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Oct  7 16:04:24 np0005474864 python3.9[202773]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:04:24 np0005474864 python3.9[202924]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759867464.1789427-1600-132269120889426/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:04:25 np0005474864 python3.9[203000]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  7 16:04:25 np0005474864 systemd[1]: Reloading.
Oct  7 16:04:25 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:04:25 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:04:26 np0005474864 python3.9[203110]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 16:04:26 np0005474864 systemd[1]: Reloading.
Oct  7 16:04:26 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:04:26 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:04:27 np0005474864 systemd[1]: Starting ceilometer_agent_compute container...
Oct  7 16:04:27 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:04:27 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a3d760c021ab2cf678f4b1661a257ee23d407c51fb1f6a0143bf2202a2c3ef0/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  7 16:04:27 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a3d760c021ab2cf678f4b1661a257ee23d407c51fb1f6a0143bf2202a2c3ef0/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Oct  7 16:04:27 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a3d760c021ab2cf678f4b1661a257ee23d407c51fb1f6a0143bf2202a2c3ef0/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Oct  7 16:04:27 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a3d760c021ab2cf678f4b1661a257ee23d407c51fb1f6a0143bf2202a2c3ef0/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Oct  7 16:04:27 np0005474864 systemd[1]: Started /usr/bin/podman healthcheck run b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a.
Oct  7 16:04:27 np0005474864 podman[203151]: 2025-10-07 20:04:27.273013966 +0000 UTC m=+0.169107528 container init b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.vendor=CentOS)
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: + sudo -E kolla_set_configs
Oct  7 16:04:27 np0005474864 podman[203151]: 2025-10-07 20:04:27.31000246 +0000 UTC m=+0.206096022 container start b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, 
managed_by=edpm_ansible)
Oct  7 16:04:27 np0005474864 podman[203151]: ceilometer_agent_compute
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: sudo: unable to send audit message: Operation not permitted
Oct  7 16:04:27 np0005474864 systemd[1]: Started ceilometer_agent_compute container.
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: INFO:__main__:Validating config file
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: INFO:__main__:Copying service configuration files
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: INFO:__main__:Writing out command to execute
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: ++ cat /run_command
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: + ARGS=
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: + sudo kolla_copy_cacerts
Oct  7 16:04:27 np0005474864 podman[203173]: 2025-10-07 20:04:27.392609377 +0000 UTC m=+0.072604288 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  7 16:04:27 np0005474864 systemd[1]: b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a-7677eaecec8f79d0.service: Main process exited, code=exited, status=1/FAILURE
Oct  7 16:04:27 np0005474864 systemd[1]: b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a-7677eaecec8f79d0.service: Failed with result 'exit-code'.
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: sudo: unable to send audit message: Operation not permitted
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: + [[ ! -n '' ]]
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: + . kolla_extend_start
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: + umask 0022
Oct  7 16:04:27 np0005474864 ceilometer_agent_compute[203166]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.268 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.268 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.268 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.269 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.269 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.269 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.269 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.269 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.269 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.269 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.269 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.269 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.270 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.270 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.270 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.270 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.270 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.270 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.270 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.270 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.270 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.270 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.271 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.271 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.271 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.271 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.271 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.271 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.271 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.271 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.271 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.271 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.271 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.271 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.271 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.272 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.272 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.272 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.272 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.272 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.272 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.272 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.272 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.272 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.272 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.272 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.272 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.273 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.273 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.273 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.273 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.273 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.273 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.273 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.273 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.273 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.273 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.273 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.273 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.274 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.274 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.274 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.274 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.274 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.274 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.274 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.274 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.274 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.274 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.274 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.275 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.275 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.275 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.275 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.275 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.275 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.275 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.275 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.275 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.275 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.275 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.275 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.275 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.276 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.276 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.276 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.276 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.276 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.276 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.276 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.276 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.276 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.276 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.276 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.276 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.277 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.277 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.277 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.277 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.277 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.277 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.277 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.277 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.277 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.277 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.277 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.278 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.278 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.278 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.278 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.278 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.278 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.278 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.278 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.278 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.278 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.278 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.278 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.279 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.279 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.279 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.279 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.279 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.279 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.279 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.279 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.279 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.279 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.279 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.280 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.280 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.280 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.280 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.280 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.280 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.280 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.280 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.280 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.280 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.280 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.281 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.281 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.281 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.281 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.281 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.281 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.281 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.281 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.281 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.281 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.281 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.281 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.282 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.282 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.282 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.282 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.282 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.282 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.282 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.282 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.282 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.282 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.282 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.282 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.282 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.304 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.305 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.306 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.385 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.456 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.456 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.456 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.456 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.456 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.457 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.457 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.457 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.457 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.457 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.457 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.457 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.457 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.457 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.457 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.457 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.458 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.458 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.458 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.458 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.458 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.458 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.458 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.458 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.458 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.458 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.458 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.459 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.459 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.459 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.459 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.459 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.459 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.459 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.459 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.459 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.459 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.459 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.459 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.459 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.460 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.460 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.460 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.460 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.460 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.460 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.460 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.460 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.460 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.460 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.460 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.461 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.461 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.461 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.461 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.461 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.461 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.461 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.461 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.461 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.461 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.461 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.461 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.461 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.462 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.462 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.462 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.462 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.462 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.462 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.462 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.462 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.462 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.462 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.462 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.463 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.463 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.463 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.463 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.463 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.463 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.463 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.463 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.463 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.463 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.463 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.463 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.464 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.464 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.464 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.464 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.464 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.464 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.464 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.464 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.464 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.464 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.464 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.464 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.465 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.465 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.465 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.465 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.465 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.465 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.465 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.465 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.465 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.465 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.465 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.466 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.466 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.466 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.466 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.466 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.466 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.466 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.466 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.466 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.466 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.466 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.467 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.467 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.467 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.467 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.467 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.467 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.467 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.467 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.467 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.467 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.467 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.467 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.468 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.468 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.468 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.468 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.468 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.468 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.468 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.468 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.468 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.468 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.468 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.468 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.469 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.469 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.469 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.469 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.469 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.469 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.469 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.469 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.469 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.469 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.469 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.469 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.469 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.470 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.470 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.470 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.470 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.470 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.470 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.470 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.470 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.470 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.470 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.470 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.470 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.471 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.471 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.471 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.471 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.471 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.471 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.471 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.471 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.471 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.471 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.471 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.471 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.472 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.472 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.472 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.472 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.472 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.472 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.472 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.472 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.472 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.472 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.472 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.472 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.472 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.473 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.473 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.473 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.473 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.473 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.473 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.473 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.473 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.473 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.473 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.473 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.473 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.474 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.474 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.474 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.474 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.474 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.474 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.474 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.474 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.474 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.474 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.474 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.474 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.478 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.485 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.490 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.490 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.490 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.491 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.492 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:28 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:28.493 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:29 np0005474864 python3.9[203356]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 16:04:29 np0005474864 systemd[1]: Stopping ceilometer_agent_compute container...
Oct  7 16:04:29 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:29.504 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Oct  7 16:04:29 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:29.605 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Oct  7 16:04:29 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:29.605 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Oct  7 16:04:29 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:29.605 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Oct  7 16:04:29 np0005474864 ceilometer_agent_compute[203166]: 2025-10-07 20:04:29.620 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Oct  7 16:04:29 np0005474864 virtqemud[192092]: End of file while reading data: Input/output error
Oct  7 16:04:29 np0005474864 virtqemud[192092]: End of file while reading data: Input/output error
Oct  7 16:04:29 np0005474864 systemd[1]: libpod-b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a.scope: Deactivated successfully.
Oct  7 16:04:29 np0005474864 podman[203360]: 2025-10-07 20:04:29.812104154 +0000 UTC m=+0.363968643 container died b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  7 16:04:29 np0005474864 systemd[1]: libpod-b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a.scope: Consumed 1.418s CPU time.
Oct  7 16:04:29 np0005474864 systemd[1]: b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a-7677eaecec8f79d0.timer: Deactivated successfully.
Oct  7 16:04:29 np0005474864 systemd[1]: Stopped /usr/bin/podman healthcheck run b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a.
Oct  7 16:04:29 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a-userdata-shm.mount: Deactivated successfully.
Oct  7 16:04:29 np0005474864 systemd[1]: var-lib-containers-storage-overlay-7a3d760c021ab2cf678f4b1661a257ee23d407c51fb1f6a0143bf2202a2c3ef0-merged.mount: Deactivated successfully.
Oct  7 16:04:29 np0005474864 podman[203360]: 2025-10-07 20:04:29.873021001 +0000 UTC m=+0.424885480 container cleanup b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  7 16:04:29 np0005474864 podman[203360]: ceilometer_agent_compute
Oct  7 16:04:29 np0005474864 podman[203390]: ceilometer_agent_compute
Oct  7 16:04:29 np0005474864 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Oct  7 16:04:29 np0005474864 systemd[1]: Stopped ceilometer_agent_compute container.
Oct  7 16:04:29 np0005474864 systemd[1]: Starting ceilometer_agent_compute container...
Oct  7 16:04:30 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:04:30 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a3d760c021ab2cf678f4b1661a257ee23d407c51fb1f6a0143bf2202a2c3ef0/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  7 16:04:30 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a3d760c021ab2cf678f4b1661a257ee23d407c51fb1f6a0143bf2202a2c3ef0/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Oct  7 16:04:30 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a3d760c021ab2cf678f4b1661a257ee23d407c51fb1f6a0143bf2202a2c3ef0/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Oct  7 16:04:30 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7a3d760c021ab2cf678f4b1661a257ee23d407c51fb1f6a0143bf2202a2c3ef0/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Oct  7 16:04:30 np0005474864 systemd[1]: Started /usr/bin/podman healthcheck run b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a.
Oct  7 16:04:30 np0005474864 podman[203403]: 2025-10-07 20:04:30.092026176 +0000 UTC m=+0.118472489 container init b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: + sudo -E kolla_set_configs
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: sudo: unable to send audit message: Operation not permitted
Oct  7 16:04:30 np0005474864 podman[203403]: 2025-10-07 20:04:30.121556393 +0000 UTC m=+0.148002706 container start b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  7 16:04:30 np0005474864 podman[203403]: ceilometer_agent_compute
Oct  7 16:04:30 np0005474864 systemd[1]: Started ceilometer_agent_compute container.
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: INFO:__main__:Validating config file
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: INFO:__main__:Copying service configuration files
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: INFO:__main__:Writing out command to execute
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: ++ cat /run_command
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: + ARGS=
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: + sudo kolla_copy_cacerts
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: sudo: unable to send audit message: Operation not permitted
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: + [[ ! -n '' ]]
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: + . kolla_extend_start
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: + umask 0022
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Oct  7 16:04:30 np0005474864 podman[203426]: 2025-10-07 20:04:30.216455457 +0000 UTC m=+0.074062070 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Oct  7 16:04:30 np0005474864 systemd[1]: b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a-24852d920216a452.service: Main process exited, code=exited, status=1/FAILURE
Oct  7 16:04:30 np0005474864 systemd[1]: b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a-24852d920216a452.service: Failed with result 'exit-code'.
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.994 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.994 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.995 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.995 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.995 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.995 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.995 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.995 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.995 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.995 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.995 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.995 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.996 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.996 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.996 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.996 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.996 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.996 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.996 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.996 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.997 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.997 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.997 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.997 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.997 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.997 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.997 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.997 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.997 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.997 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.998 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.998 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.998 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.998 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.998 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.998 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.998 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.998 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.998 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.998 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.998 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.998 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.999 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.999 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.999 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.999 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.999 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.999 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.999 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.999 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:30 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.999 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:30.999 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.000 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.000 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.000 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.000 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.000 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.000 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.000 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.000 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.000 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.000 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.000 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.001 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.001 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.001 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.001 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.001 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.001 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.001 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.001 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.001 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.001 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.001 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.002 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.002 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.002 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.002 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.002 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.002 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.002 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.002 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.002 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.002 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.002 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.002 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.003 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.004 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.004 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.004 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.004 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.004 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.004 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.004 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.004 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.004 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.005 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.005 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.005 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.005 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.005 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.005 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.005 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.005 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.006 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.006 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.006 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.006 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.006 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.006 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.006 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.006 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.006 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.006 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.006 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.007 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.007 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.007 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.007 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.007 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.007 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.007 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.007 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.007 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.007 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.008 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.008 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.008 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.008 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.008 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.008 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.008 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.008 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.008 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.008 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.008 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.009 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.009 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.009 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.009 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.009 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.009 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.009 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.009 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.009 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.009 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.010 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.010 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.010 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.010 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.010 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.010 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.010 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.010 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.011 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.011 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.030 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.032 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.033 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Oct  7 16:04:31 np0005474864 python3.9[203604]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.048 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.183 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.183 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.183 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.183 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.183 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.183 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.184 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.185 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.185 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.185 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.186 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.186 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.186 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.187 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.187 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.187 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.188 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.188 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.188 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.189 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.189 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.189 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.190 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.190 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.190 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.191 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.191 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.191 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.191 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.191 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.192 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.192 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.192 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.192 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.192 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.192 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.193 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.193 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.193 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.194 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.194 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.194 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.194 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.195 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.195 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.195 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.196 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.196 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.196 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.196 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.197 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.197 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.197 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.197 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.198 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.198 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.198 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.199 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.199 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.199 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.199 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.200 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.200 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.200 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.201 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.201 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.201 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.201 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.202 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.202 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.202 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.202 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.203 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.203 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.203 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.203 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.204 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.204 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.204 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.204 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.204 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.205 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.205 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.205 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.205 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.205 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.206 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.206 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.206 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.206 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.206 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.207 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.207 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.207 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.207 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.208 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.208 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.209 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.209 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.209 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.209 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.210 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.210 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.210 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.210 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.211 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.211 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.211 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.211 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.211 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.212 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.212 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.212 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.212 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.212 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.213 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.213 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.213 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.213 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.214 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.214 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.214 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.214 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.215 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.215 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.215 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.215 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.216 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.216 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.216 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.216 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.217 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.218 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.219 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.219 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.219 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.219 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.219 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.220 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.220 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.220 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.220 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.220 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.221 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.221 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.221 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.221 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.221 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.222 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.222 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.222 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.222 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.222 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.223 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.223 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.223 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.223 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.224 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.224 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.224 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.224 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.224 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.225 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.225 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.225 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.225 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.226 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.226 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.226 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.226 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.226 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.227 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.227 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.227 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.227 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.228 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.228 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.228 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.228 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.228 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.229 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.229 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.229 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.229 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.230 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.230 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.230 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.230 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.230 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.231 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.231 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.231 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.231 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.231 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.232 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.232 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.232 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.232 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.232 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.233 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.233 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.233 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.233 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.234 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.234 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.234 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.234 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.234 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.234 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.239 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.248 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:04:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:04:31 np0005474864 python3.9[203733]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759867470.4271774-1697-178427403766011/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:04:32 np0005474864 python3.9[203885]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Oct  7 16:04:33 np0005474864 python3.9[204037]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  7 16:04:34 np0005474864 python3[204189]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct  7 16:04:35 np0005474864 podman[204224]: 2025-10-07 20:04:35.115723721 +0000 UTC m=+0.067957663 container create 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  7 16:04:35 np0005474864 podman[204224]: 2025-10-07 20:04:35.078324015 +0000 UTC m=+0.030558037 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Oct  7 16:04:35 np0005474864 python3[204189]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Oct  7 16:04:36 np0005474864 python3.9[204415]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:04:37 np0005474864 python3.9[204569]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:04:38 np0005474864 python3.9[204720]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759867477.377277-1855-36205529573160/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:04:38 np0005474864 python3.9[204796]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  7 16:04:38 np0005474864 systemd[1]: Reloading.
Oct  7 16:04:38 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:04:38 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:04:39 np0005474864 python3.9[204907]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 16:04:39 np0005474864 systemd[1]: Reloading.
Oct  7 16:04:39 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:04:39 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:04:40 np0005474864 systemd[1]: Starting node_exporter container...
Oct  7 16:04:40 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:04:40 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2436c72640e2e4ca8473f3ca4141a344f12bc18ba3d62f2f2f99e52a3e6a1d0c/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  7 16:04:40 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2436c72640e2e4ca8473f3ca4141a344f12bc18ba3d62f2f2f99e52a3e6a1d0c/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  7 16:04:40 np0005474864 systemd[1]: Started /usr/bin/podman healthcheck run 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815.
Oct  7 16:04:40 np0005474864 podman[204946]: 2025-10-07 20:04:40.286532643 +0000 UTC m=+0.133841955 container init 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.307Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.307Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.307Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.308Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.308Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=arp
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=bcache
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=bonding
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=btrfs
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=conntrack
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=cpu
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=cpufreq
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=diskstats
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=edac
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=fibrechannel
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=filefd
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=filesystem
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=infiniband
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=ipvs
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=loadavg
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=mdadm
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=meminfo
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=netclass
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=netdev
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=netstat
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=nfs
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=nfsd
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=nvme
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=schedstat
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=sockstat
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=softnet
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=systemd
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=tapestats
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.309Z caller=node_exporter.go:117 level=info collector=udp_queues
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.310Z caller=node_exporter.go:117 level=info collector=vmstat
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.310Z caller=node_exporter.go:117 level=info collector=xfs
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.310Z caller=node_exporter.go:117 level=info collector=zfs
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.310Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Oct  7 16:04:40 np0005474864 node_exporter[204963]: ts=2025-10-07T20:04:40.311Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Oct  7 16:04:40 np0005474864 podman[204946]: 2025-10-07 20:04:40.322663531 +0000 UTC m=+0.169972843 container start 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  7 16:04:40 np0005474864 podman[204946]: node_exporter
Oct  7 16:04:40 np0005474864 systemd[1]: Started node_exporter container.
Oct  7 16:04:40 np0005474864 podman[204972]: 2025-10-07 20:04:40.422140588 +0000 UTC m=+0.081047223 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:04:41 np0005474864 python3.9[205147]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 16:04:41 np0005474864 systemd[1]: Stopping node_exporter container...
Oct  7 16:04:41 np0005474864 systemd[1]: libpod-52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815.scope: Deactivated successfully.
Oct  7 16:04:41 np0005474864 podman[205151]: 2025-10-07 20:04:41.665080425 +0000 UTC m=+0.055573773 container died 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  7 16:04:41 np0005474864 systemd[1]: 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815-c8d12beeb818c21.timer: Deactivated successfully.
Oct  7 16:04:41 np0005474864 systemd[1]: Stopped /usr/bin/podman healthcheck run 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815.
Oct  7 16:04:41 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815-userdata-shm.mount: Deactivated successfully.
Oct  7 16:04:41 np0005474864 systemd[1]: var-lib-containers-storage-overlay-2436c72640e2e4ca8473f3ca4141a344f12bc18ba3d62f2f2f99e52a3e6a1d0c-merged.mount: Deactivated successfully.
Oct  7 16:04:41 np0005474864 podman[205151]: 2025-10-07 20:04:41.718537706 +0000 UTC m=+0.109031024 container cleanup 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  7 16:04:41 np0005474864 podman[205151]: node_exporter
Oct  7 16:04:41 np0005474864 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct  7 16:04:41 np0005474864 podman[205180]: node_exporter
Oct  7 16:04:41 np0005474864 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Oct  7 16:04:41 np0005474864 systemd[1]: Stopped node_exporter container.
Oct  7 16:04:41 np0005474864 systemd[1]: Starting node_exporter container...
Oct  7 16:04:41 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:04:41 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2436c72640e2e4ca8473f3ca4141a344f12bc18ba3d62f2f2f99e52a3e6a1d0c/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  7 16:04:41 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2436c72640e2e4ca8473f3ca4141a344f12bc18ba3d62f2f2f99e52a3e6a1d0c/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  7 16:04:42 np0005474864 systemd[1]: Started /usr/bin/podman healthcheck run 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815.
Oct  7 16:04:42 np0005474864 podman[205193]: 2025-10-07 20:04:42.026868613 +0000 UTC m=+0.172078784 container init 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.046Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.046Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.046Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.047Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.047Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.048Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.048Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.048Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.048Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=arp
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=bcache
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=bonding
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=btrfs
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=conntrack
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=cpu
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=cpufreq
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=diskstats
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=edac
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=fibrechannel
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=filefd
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=filesystem
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=infiniband
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=ipvs
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=loadavg
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=mdadm
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=meminfo
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=netclass
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=netdev
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=netstat
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=nfs
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=nfsd
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=nvme
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=schedstat
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=sockstat
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=softnet
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=systemd
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=tapestats
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=udp_queues
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=vmstat
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=xfs
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.049Z caller=node_exporter.go:117 level=info collector=zfs
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.050Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Oct  7 16:04:42 np0005474864 node_exporter[205208]: ts=2025-10-07T20:04:42.050Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Oct  7 16:04:42 np0005474864 podman[205193]: 2025-10-07 20:04:42.067948675 +0000 UTC m=+0.213158786 container start 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:04:42 np0005474864 podman[205193]: node_exporter
Oct  7 16:04:42 np0005474864 systemd[1]: Started node_exporter container.
Oct  7 16:04:42 np0005474864 podman[205217]: 2025-10-07 20:04:42.173000924 +0000 UTC m=+0.089610132 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:04:42 np0005474864 python3.9[205393]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:04:43 np0005474864 python3.9[205516]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759867482.4090703-1951-60272898026661/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:04:44 np0005474864 python3.9[205668]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Oct  7 16:04:45 np0005474864 python3.9[205820]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  7 16:04:47 np0005474864 python3[205972]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct  7 16:04:48 np0005474864 podman[206030]: 2025-10-07 20:04:48.246000706 +0000 UTC m=+0.102488275 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  7 16:04:48 np0005474864 podman[206031]: 2025-10-07 20:04:48.436530714 +0000 UTC m=+0.286061871 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd)
Oct  7 16:04:48 np0005474864 podman[205986]: 2025-10-07 20:04:48.666188829 +0000 UTC m=+1.502157401 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Oct  7 16:04:48 np0005474864 podman[206124]: 2025-10-07 20:04:48.804292286 +0000 UTC m=+0.043177544 container create b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  7 16:04:48 np0005474864 podman[206124]: 2025-10-07 20:04:48.782167694 +0000 UTC m=+0.021052972 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Oct  7 16:04:48 np0005474864 python3[205972]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Oct  7 16:04:49 np0005474864 podman[206287]: 2025-10-07 20:04:49.611501529 +0000 UTC m=+0.116669856 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  7 16:04:49 np0005474864 python3.9[206336]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:04:50 np0005474864 python3.9[206494]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:04:51 np0005474864 python3.9[206645]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759867490.9446654-2110-15423382749621/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:04:52 np0005474864 python3.9[206721]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  7 16:04:52 np0005474864 systemd[1]: Reloading.
Oct  7 16:04:52 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:04:52 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:04:53 np0005474864 python3.9[206833]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 16:04:53 np0005474864 systemd[1]: Reloading.
Oct  7 16:04:53 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:04:53 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:04:53 np0005474864 systemd[1]: Starting podman_exporter container...
Oct  7 16:04:53 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:04:53 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03a85ff8dde1676d2149b7e9c7431dd723038c9303c0662012777780e7305602/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  7 16:04:53 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03a85ff8dde1676d2149b7e9c7431dd723038c9303c0662012777780e7305602/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  7 16:04:53 np0005474864 systemd[1]: Started /usr/bin/podman healthcheck run b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193.
Oct  7 16:04:53 np0005474864 podman[206873]: 2025-10-07 20:04:53.973173923 +0000 UTC m=+0.196884884 container init b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:04:53 np0005474864 podman_exporter[206888]: ts=2025-10-07T20:04:53.990Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct  7 16:04:53 np0005474864 podman_exporter[206888]: ts=2025-10-07T20:04:53.990Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct  7 16:04:53 np0005474864 podman_exporter[206888]: ts=2025-10-07T20:04:53.990Z caller=handler.go:94 level=info msg="enabled collectors"
Oct  7 16:04:53 np0005474864 podman_exporter[206888]: ts=2025-10-07T20:04:53.990Z caller=handler.go:105 level=info collector=container
Oct  7 16:04:53 np0005474864 podman[206873]: 2025-10-07 20:04:53.99620008 +0000 UTC m=+0.219910971 container start b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:04:54 np0005474864 podman[206873]: podman_exporter
Oct  7 16:04:54 np0005474864 systemd[1]: Starting Podman API Service...
Oct  7 16:04:54 np0005474864 systemd[1]: Started Podman API Service.
Oct  7 16:04:54 np0005474864 systemd[1]: Started podman_exporter container.
Oct  7 16:04:54 np0005474864 podman[206912]: time="2025-10-07T20:04:54Z" level=info msg="/usr/bin/podman filtering at log level info"
Oct  7 16:04:54 np0005474864 podman[206912]: time="2025-10-07T20:04:54Z" level=info msg="Setting parallel job count to 25"
Oct  7 16:04:54 np0005474864 podman[206912]: time="2025-10-07T20:04:54Z" level=info msg="Using sqlite as database backend"
Oct  7 16:04:54 np0005474864 podman[206912]: time="2025-10-07T20:04:54Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Oct  7 16:04:54 np0005474864 podman[206912]: time="2025-10-07T20:04:54Z" level=info msg="Using systemd socket activation to determine API endpoint"
Oct  7 16:04:54 np0005474864 podman[206912]: time="2025-10-07T20:04:54Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Oct  7 16:04:54 np0005474864 podman[206912]: @ - - [07/Oct/2025:20:04:54 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct  7 16:04:54 np0005474864 podman[206912]: time="2025-10-07T20:04:54Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct  7 16:04:54 np0005474864 podman[206891]: 2025-10-07 20:04:54.077100398 +0000 UTC m=+0.104392289 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  7 16:04:54 np0005474864 podman[206904]: 2025-10-07 20:04:54.086074298 +0000 UTC m=+0.070976720 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=starting, health_failing_streak=1, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:04:54 np0005474864 systemd[1]: b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193-4f21df046fc68725.service: Main process exited, code=exited, status=1/FAILURE
Oct  7 16:04:54 np0005474864 systemd[1]: b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193-4f21df046fc68725.service: Failed with result 'exit-code'.
Oct  7 16:04:54 np0005474864 podman[206912]: @ - - [07/Oct/2025:20:04:54 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 22058 "" "Go-http-client/1.1"
Oct  7 16:04:54 np0005474864 podman_exporter[206888]: ts=2025-10-07T20:04:54.105Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct  7 16:04:54 np0005474864 podman_exporter[206888]: ts=2025-10-07T20:04:54.106Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct  7 16:04:54 np0005474864 podman_exporter[206888]: ts=2025-10-07T20:04:54.106Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct  7 16:04:55 np0005474864 python3.9[207102]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 16:04:55 np0005474864 systemd[1]: Stopping podman_exporter container...
Oct  7 16:04:55 np0005474864 systemd[1]: libpod-b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193.scope: Deactivated successfully.
Oct  7 16:04:55 np0005474864 podman[206912]: @ - - [07/Oct/2025:20:04:54 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 3926 "" "Go-http-client/1.1"
Oct  7 16:04:55 np0005474864 podman[207106]: 2025-10-07 20:04:55.332332982 +0000 UTC m=+0.075328227 container died b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  7 16:04:55 np0005474864 systemd[1]: b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193-4f21df046fc68725.timer: Deactivated successfully.
Oct  7 16:04:55 np0005474864 systemd[1]: Stopped /usr/bin/podman healthcheck run b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193.
Oct  7 16:04:55 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193-userdata-shm.mount: Deactivated successfully.
Oct  7 16:04:55 np0005474864 systemd[1]: var-lib-containers-storage-overlay-03a85ff8dde1676d2149b7e9c7431dd723038c9303c0662012777780e7305602-merged.mount: Deactivated successfully.
Oct  7 16:04:55 np0005474864 podman[207106]: 2025-10-07 20:04:55.937183683 +0000 UTC m=+0.680178898 container cleanup b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  7 16:04:55 np0005474864 podman[207106]: podman_exporter
Oct  7 16:04:55 np0005474864 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct  7 16:04:56 np0005474864 podman[207135]: podman_exporter
Oct  7 16:04:56 np0005474864 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Oct  7 16:04:56 np0005474864 systemd[1]: Stopped podman_exporter container.
Oct  7 16:04:56 np0005474864 systemd[1]: Starting podman_exporter container...
Oct  7 16:04:56 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:04:56 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03a85ff8dde1676d2149b7e9c7431dd723038c9303c0662012777780e7305602/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  7 16:04:56 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/03a85ff8dde1676d2149b7e9c7431dd723038c9303c0662012777780e7305602/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  7 16:04:56 np0005474864 systemd[1]: Started /usr/bin/podman healthcheck run b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193.
Oct  7 16:04:56 np0005474864 podman[207148]: 2025-10-07 20:04:56.236019474 +0000 UTC m=+0.169362435 container init b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:04:56 np0005474864 podman_exporter[207163]: ts=2025-10-07T20:04:56.256Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Oct  7 16:04:56 np0005474864 podman_exporter[207163]: ts=2025-10-07T20:04:56.256Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Oct  7 16:04:56 np0005474864 podman_exporter[207163]: ts=2025-10-07T20:04:56.256Z caller=handler.go:94 level=info msg="enabled collectors"
Oct  7 16:04:56 np0005474864 podman_exporter[207163]: ts=2025-10-07T20:04:56.256Z caller=handler.go:105 level=info collector=container
Oct  7 16:04:56 np0005474864 podman[206912]: @ - - [07/Oct/2025:20:04:56 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Oct  7 16:04:56 np0005474864 podman[206912]: time="2025-10-07T20:04:56Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Oct  7 16:04:56 np0005474864 podman[207148]: 2025-10-07 20:04:56.283656357 +0000 UTC m=+0.216999338 container start b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:04:56 np0005474864 podman[207148]: podman_exporter
Oct  7 16:04:56 np0005474864 systemd[1]: Started podman_exporter container.
Oct  7 16:04:56 np0005474864 podman[206912]: @ - - [07/Oct/2025:20:04:56 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 22060 "" "Go-http-client/1.1"
Oct  7 16:04:56 np0005474864 podman_exporter[207163]: ts=2025-10-07T20:04:56.307Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Oct  7 16:04:56 np0005474864 podman_exporter[207163]: ts=2025-10-07T20:04:56.307Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Oct  7 16:04:56 np0005474864 podman_exporter[207163]: ts=2025-10-07T20:04:56.308Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Oct  7 16:04:56 np0005474864 podman[207173]: 2025-10-07 20:04:56.373656658 +0000 UTC m=+0.083136373 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:04:57 np0005474864 python3.9[207350]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:04:57 np0005474864 python3.9[207473]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1759867496.6455302-2207-8241832953827/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Oct  7 16:04:58 np0005474864 python3.9[207625]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Oct  7 16:04:59 np0005474864 python3.9[207777]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Oct  7 16:05:00 np0005474864 podman[207802]: 2025-10-07 20:05:00.386197862 +0000 UTC m=+0.079983912 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  7 16:05:00 np0005474864 systemd[1]: b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a-24852d920216a452.service: Main process exited, code=exited, status=1/FAILURE
Oct  7 16:05:00 np0005474864 systemd[1]: b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a-24852d920216a452.service: Failed with result 'exit-code'.
Oct  7 16:05:00 np0005474864 python3[207946]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Oct  7 16:05:04 np0005474864 podman[207959]: 2025-10-07 20:05:04.759856123 +0000 UTC m=+3.765869446 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct  7 16:05:04 np0005474864 podman[208057]: 2025-10-07 20:05:04.994228384 +0000 UTC m=+0.124322718 container create cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., architecture=x86_64, vcs-type=git, distribution-scope=public, release=1755695350, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, config_id=edpm, managed_by=edpm_ansible)
Oct  7 16:05:04 np0005474864 podman[208057]: 2025-10-07 20:05:04.906987493 +0000 UTC m=+0.037081827 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct  7 16:05:05 np0005474864 python3[207946]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Oct  7 16:05:06 np0005474864 python3.9[208247]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:05:07 np0005474864 python3.9[208401]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:05:07 np0005474864 python3.9[208552]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1759867507.158012-2365-259875387840707/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:05:08 np0005474864 python3.9[208628]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Oct  7 16:05:08 np0005474864 systemd[1]: Reloading.
Oct  7 16:05:08 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:05:08 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:05:09 np0005474864 python3.9[208740]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Oct  7 16:05:09 np0005474864 systemd[1]: Reloading.
Oct  7 16:05:09 np0005474864 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Oct  7 16:05:09 np0005474864 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Oct  7 16:05:09 np0005474864 systemd[1]: Starting openstack_network_exporter container...
Oct  7 16:05:10 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:05:10 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc70266aa66a4da15f0dc5953d4544103f1ed39592a81548f50b03f448fd80ed/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  7 16:05:10 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc70266aa66a4da15f0dc5953d4544103f1ed39592a81548f50b03f448fd80ed/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  7 16:05:10 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc70266aa66a4da15f0dc5953d4544103f1ed39592a81548f50b03f448fd80ed/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  7 16:05:10 np0005474864 systemd[1]: Started /usr/bin/podman healthcheck run cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85.
Oct  7 16:05:10 np0005474864 podman[208780]: 2025-10-07 20:05:10.195957655 +0000 UTC m=+0.175485084 container init cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, distribution-scope=public, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.tags=minimal rhel9)
Oct  7 16:05:10 np0005474864 openstack_network_exporter[208795]: INFO    20:05:10 main.go:48: registering *bridge.Collector
Oct  7 16:05:10 np0005474864 openstack_network_exporter[208795]: INFO    20:05:10 main.go:48: registering *coverage.Collector
Oct  7 16:05:10 np0005474864 openstack_network_exporter[208795]: INFO    20:05:10 main.go:48: registering *datapath.Collector
Oct  7 16:05:10 np0005474864 openstack_network_exporter[208795]: INFO    20:05:10 main.go:48: registering *iface.Collector
Oct  7 16:05:10 np0005474864 openstack_network_exporter[208795]: INFO    20:05:10 main.go:48: registering *memory.Collector
Oct  7 16:05:10 np0005474864 openstack_network_exporter[208795]: INFO    20:05:10 main.go:48: registering *ovnnorthd.Collector
Oct  7 16:05:10 np0005474864 openstack_network_exporter[208795]: INFO    20:05:10 main.go:48: registering *ovn.Collector
Oct  7 16:05:10 np0005474864 openstack_network_exporter[208795]: INFO    20:05:10 main.go:48: registering *ovsdbserver.Collector
Oct  7 16:05:10 np0005474864 openstack_network_exporter[208795]: INFO    20:05:10 main.go:48: registering *pmd_perf.Collector
Oct  7 16:05:10 np0005474864 openstack_network_exporter[208795]: INFO    20:05:10 main.go:48: registering *pmd_rxq.Collector
Oct  7 16:05:10 np0005474864 openstack_network_exporter[208795]: INFO    20:05:10 main.go:48: registering *vswitch.Collector
Oct  7 16:05:10 np0005474864 openstack_network_exporter[208795]: NOTICE  20:05:10 main.go:76: listening on https://:9105/metrics
Oct  7 16:05:10 np0005474864 podman[208780]: 2025-10-07 20:05:10.227650224 +0000 UTC m=+0.207177663 container start cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, architecture=x86_64, io.openshift.expose-services=, config_id=edpm, release=1755695350, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, maintainer=Red Hat, Inc.)
Oct  7 16:05:10 np0005474864 podman[208780]: openstack_network_exporter
Oct  7 16:05:10 np0005474864 systemd[1]: Started openstack_network_exporter container.
Oct  7 16:05:10 np0005474864 podman[208800]: 2025-10-07 20:05:10.380944182 +0000 UTC m=+0.127889212 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, distribution-scope=public, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.33.7, maintainer=Red Hat, Inc.)
Oct  7 16:05:11 np0005474864 python3.9[208978]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Oct  7 16:05:11 np0005474864 systemd[1]: Stopping openstack_network_exporter container...
Oct  7 16:05:11 np0005474864 systemd[1]: libpod-cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85.scope: Deactivated successfully.
Oct  7 16:05:11 np0005474864 podman[208982]: 2025-10-07 20:05:11.451342503 +0000 UTC m=+0.151999052 container died cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, maintainer=Red Hat, Inc., name=ubi9-minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, distribution-scope=public, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  7 16:05:11 np0005474864 systemd[1]: cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85-159d8576999939e0.timer: Deactivated successfully.
Oct  7 16:05:11 np0005474864 systemd[1]: Stopped /usr/bin/podman healthcheck run cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85.
Oct  7 16:05:11 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85-userdata-shm.mount: Deactivated successfully.
Oct  7 16:05:11 np0005474864 systemd[1]: var-lib-containers-storage-overlay-bc70266aa66a4da15f0dc5953d4544103f1ed39592a81548f50b03f448fd80ed-merged.mount: Deactivated successfully.
Oct  7 16:05:12 np0005474864 podman[209011]: 2025-10-07 20:05:12.382642746 +0000 UTC m=+0.076655465 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  7 16:05:12 np0005474864 nova_compute[192593]: 2025-10-07 20:05:12.425 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:05:12 np0005474864 nova_compute[192593]: 2025-10-07 20:05:12.450 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:05:12 np0005474864 nova_compute[192593]: 2025-10-07 20:05:12.450 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:05:13 np0005474864 nova_compute[192593]: 2025-10-07 20:05:13.091 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:05:13 np0005474864 nova_compute[192593]: 2025-10-07 20:05:13.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:05:13 np0005474864 nova_compute[192593]: 2025-10-07 20:05:13.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:05:13 np0005474864 nova_compute[192593]: 2025-10-07 20:05:13.092 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:05:13 np0005474864 nova_compute[192593]: 2025-10-07 20:05:13.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:05:13 np0005474864 nova_compute[192593]: 2025-10-07 20:05:13.140 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:05:13 np0005474864 nova_compute[192593]: 2025-10-07 20:05:13.140 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:05:13 np0005474864 nova_compute[192593]: 2025-10-07 20:05:13.141 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:05:13 np0005474864 nova_compute[192593]: 2025-10-07 20:05:13.141 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:05:13 np0005474864 nova_compute[192593]: 2025-10-07 20:05:13.349 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:05:13 np0005474864 nova_compute[192593]: 2025-10-07 20:05:13.350 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5868MB free_disk=73.49810791015625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:05:13 np0005474864 nova_compute[192593]: 2025-10-07 20:05:13.351 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:05:13 np0005474864 nova_compute[192593]: 2025-10-07 20:05:13.351 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:05:13 np0005474864 nova_compute[192593]: 2025-10-07 20:05:13.443 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:05:13 np0005474864 nova_compute[192593]: 2025-10-07 20:05:13.444 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:05:13 np0005474864 nova_compute[192593]: 2025-10-07 20:05:13.482 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:05:13 np0005474864 nova_compute[192593]: 2025-10-07 20:05:13.502 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:05:13 np0005474864 nova_compute[192593]: 2025-10-07 20:05:13.505 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:05:13 np0005474864 nova_compute[192593]: 2025-10-07 20:05:13.506 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:05:13 np0005474864 podman[208982]: 2025-10-07 20:05:13.534515821 +0000 UTC m=+2.235172380 container cleanup cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.openshift.tags=minimal rhel9, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, vcs-type=git, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  7 16:05:13 np0005474864 podman[208982]: openstack_network_exporter
Oct  7 16:05:13 np0005474864 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Oct  7 16:05:13 np0005474864 podman[209035]: openstack_network_exporter
Oct  7 16:05:13 np0005474864 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Oct  7 16:05:13 np0005474864 systemd[1]: Stopped openstack_network_exporter container.
Oct  7 16:05:13 np0005474864 systemd[1]: Starting openstack_network_exporter container...
Oct  7 16:05:13 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:05:13 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc70266aa66a4da15f0dc5953d4544103f1ed39592a81548f50b03f448fd80ed/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Oct  7 16:05:13 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc70266aa66a4da15f0dc5953d4544103f1ed39592a81548f50b03f448fd80ed/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Oct  7 16:05:13 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc70266aa66a4da15f0dc5953d4544103f1ed39592a81548f50b03f448fd80ed/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Oct  7 16:05:13 np0005474864 systemd[1]: Started /usr/bin/podman healthcheck run cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85.
Oct  7 16:05:13 np0005474864 podman[209048]: 2025-10-07 20:05:13.798009757 +0000 UTC m=+0.151133077 container init cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.buildah.version=1.33.7, release=1755695350, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, name=ubi9-minimal)
Oct  7 16:05:13 np0005474864 openstack_network_exporter[209063]: INFO    20:05:13 main.go:48: registering *bridge.Collector
Oct  7 16:05:13 np0005474864 openstack_network_exporter[209063]: INFO    20:05:13 main.go:48: registering *coverage.Collector
Oct  7 16:05:13 np0005474864 openstack_network_exporter[209063]: INFO    20:05:13 main.go:48: registering *datapath.Collector
Oct  7 16:05:13 np0005474864 openstack_network_exporter[209063]: INFO    20:05:13 main.go:48: registering *iface.Collector
Oct  7 16:05:13 np0005474864 openstack_network_exporter[209063]: INFO    20:05:13 main.go:48: registering *memory.Collector
Oct  7 16:05:13 np0005474864 openstack_network_exporter[209063]: INFO    20:05:13 main.go:48: registering *ovnnorthd.Collector
Oct  7 16:05:13 np0005474864 openstack_network_exporter[209063]: INFO    20:05:13 main.go:48: registering *ovn.Collector
Oct  7 16:05:13 np0005474864 openstack_network_exporter[209063]: INFO    20:05:13 main.go:48: registering *ovsdbserver.Collector
Oct  7 16:05:13 np0005474864 openstack_network_exporter[209063]: INFO    20:05:13 main.go:48: registering *pmd_perf.Collector
Oct  7 16:05:13 np0005474864 openstack_network_exporter[209063]: INFO    20:05:13 main.go:48: registering *pmd_rxq.Collector
Oct  7 16:05:13 np0005474864 openstack_network_exporter[209063]: INFO    20:05:13 main.go:48: registering *vswitch.Collector
Oct  7 16:05:13 np0005474864 openstack_network_exporter[209063]: NOTICE  20:05:13 main.go:76: listening on https://:9105/metrics
Oct  7 16:05:13 np0005474864 podman[209048]: 2025-10-07 20:05:13.825912827 +0000 UTC m=+0.179036097 container start cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, architecture=x86_64, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git)
Oct  7 16:05:13 np0005474864 podman[209048]: openstack_network_exporter
Oct  7 16:05:13 np0005474864 systemd[1]: Started openstack_network_exporter container.
Oct  7 16:05:13 np0005474864 podman[209073]: 2025-10-07 20:05:13.960736449 +0000 UTC m=+0.115451361 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, managed_by=edpm_ansible, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vcs-type=git, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container)
Oct  7 16:05:14 np0005474864 nova_compute[192593]: 2025-10-07 20:05:14.502 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:05:14 np0005474864 nova_compute[192593]: 2025-10-07 20:05:14.502 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:05:14 np0005474864 nova_compute[192593]: 2025-10-07 20:05:14.503 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:05:14 np0005474864 nova_compute[192593]: 2025-10-07 20:05:14.503 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:05:14 np0005474864 nova_compute[192593]: 2025-10-07 20:05:14.663 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:05:14 np0005474864 nova_compute[192593]: 2025-10-07 20:05:14.664 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:05:14 np0005474864 python3.9[209245]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Oct  7 16:05:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:05:16.173 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:05:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:05:16.174 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:05:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:05:16.174 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:05:19 np0005474864 podman[209271]: 2025-10-07 20:05:19.369260279 +0000 UTC m=+0.068354675 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:05:19 np0005474864 podman[209270]: 2025-10-07 20:05:19.392175104 +0000 UTC m=+0.085710668 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:05:20 np0005474864 podman[209311]: 2025-10-07 20:05:20.404024355 +0000 UTC m=+0.093041191 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  7 16:05:24 np0005474864 podman[209337]: 2025-10-07 20:05:24.362879175 +0000 UTC m=+0.063471776 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  7 16:05:27 np0005474864 podman[209356]: 2025-10-07 20:05:27.379844388 +0000 UTC m=+0.068247873 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:05:31 np0005474864 podman[209382]: 2025-10-07 20:05:31.350871443 +0000 UTC m=+0.050277489 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  7 16:05:31 np0005474864 systemd[1]: b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a-24852d920216a452.service: Main process exited, code=exited, status=1/FAILURE
Oct  7 16:05:31 np0005474864 systemd[1]: b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a-24852d920216a452.service: Failed with result 'exit-code'.
Oct  7 16:05:44 np0005474864 podman[209403]: 2025-10-07 20:05:44.41280422 +0000 UTC m=+0.094235336 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, name=ubi9-minimal, vcs-type=git, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, config_id=edpm, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  7 16:05:44 np0005474864 podman[209402]: 2025-10-07 20:05:44.430800595 +0000 UTC m=+0.118078938 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:05:50 np0005474864 podman[209445]: 2025-10-07 20:05:50.386173374 +0000 UTC m=+0.079050943 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=iscsid)
Oct  7 16:05:50 np0005474864 podman[209446]: 2025-10-07 20:05:50.41751667 +0000 UTC m=+0.100599369 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  7 16:05:50 np0005474864 podman[209486]: 2025-10-07 20:05:50.553957702 +0000 UTC m=+0.105816487 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  7 16:05:55 np0005474864 podman[209512]: 2025-10-07 20:05:55.397795145 +0000 UTC m=+0.088679968 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  7 16:05:58 np0005474864 podman[209534]: 2025-10-07 20:05:58.381164749 +0000 UTC m=+0.073998377 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:06:02 np0005474864 podman[209559]: 2025-10-07 20:06:02.372602797 +0000 UTC m=+0.061431468 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=4, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  7 16:06:02 np0005474864 systemd[1]: b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a-24852d920216a452.service: Main process exited, code=exited, status=1/FAILURE
Oct  7 16:06:02 np0005474864 systemd[1]: b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a-24852d920216a452.service: Failed with result 'exit-code'.
Oct  7 16:06:12 np0005474864 nova_compute[192593]: 2025-10-07 20:06:12.100 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:06:13 np0005474864 nova_compute[192593]: 2025-10-07 20:06:13.102 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:06:14 np0005474864 nova_compute[192593]: 2025-10-07 20:06:14.089 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:06:14 np0005474864 nova_compute[192593]: 2025-10-07 20:06:14.091 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:06:14 np0005474864 nova_compute[192593]: 2025-10-07 20:06:14.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.098 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.099 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.100 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.124 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.124 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.125 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.125 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.151 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.152 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.152 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.152 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.379 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.380 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6013MB free_disk=73.4986801147461GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.381 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.381 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:06:15 np0005474864 podman[209578]: 2025-10-07 20:06:15.402903371 +0000 UTC m=+0.089155411 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:06:15 np0005474864 podman[209579]: 2025-10-07 20:06:15.404065224 +0000 UTC m=+0.088753989 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, release=1755695350, version=9.6, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9)
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.458 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.458 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.484 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.502 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.504 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:06:15 np0005474864 nova_compute[192593]: 2025-10-07 20:06:15.504 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:06:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:06:16.174 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:06:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:06:16.175 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:06:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:06:16.175 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:06:16 np0005474864 nova_compute[192593]: 2025-10-07 20:06:16.473 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:06:19 np0005474864 python3.9[209748]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Oct  7 16:06:20 np0005474864 python3.9[209913]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  7 16:06:20 np0005474864 systemd[1]: Started libpod-conmon-a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a.scope.
Oct  7 16:06:20 np0005474864 podman[209914]: 2025-10-07 20:06:20.582633919 +0000 UTC m=+0.107283369 container exec a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  7 16:06:20 np0005474864 podman[209914]: 2025-10-07 20:06:20.616705203 +0000 UTC m=+0.141354653 container exec_died a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  7 16:06:20 np0005474864 systemd[1]: libpod-conmon-a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a.scope: Deactivated successfully.
Oct  7 16:06:20 np0005474864 podman[209931]: 2025-10-07 20:06:20.670141701 +0000 UTC m=+0.084091356 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  7 16:06:20 np0005474864 podman[209934]: 2025-10-07 20:06:20.698193263 +0000 UTC m=+0.110017817 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  7 16:06:20 np0005474864 podman[209933]: 2025-10-07 20:06:20.709998261 +0000 UTC m=+0.117357177 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd)
Oct  7 16:06:21 np0005474864 python3.9[210157]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  7 16:06:21 np0005474864 systemd[1]: Started libpod-conmon-a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a.scope.
Oct  7 16:06:21 np0005474864 podman[210158]: 2025-10-07 20:06:21.644288038 +0000 UTC m=+0.111349255 container exec a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  7 16:06:21 np0005474864 podman[210158]: 2025-10-07 20:06:21.6817726 +0000 UTC m=+0.148833837 container exec_died a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Oct  7 16:06:21 np0005474864 systemd[1]: libpod-conmon-a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a.scope: Deactivated successfully.
Oct  7 16:06:22 np0005474864 python3.9[210342]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:06:23 np0005474864 python3.9[210494]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Oct  7 16:06:24 np0005474864 python3.9[210660]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  7 16:06:24 np0005474864 systemd[1]: Started libpod-conmon-a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c.scope.
Oct  7 16:06:24 np0005474864 podman[210661]: 2025-10-07 20:06:24.631548131 +0000 UTC m=+0.101779021 container exec a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:06:24 np0005474864 podman[210661]: 2025-10-07 20:06:24.662312171 +0000 UTC m=+0.132543071 container exec_died a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  7 16:06:24 np0005474864 systemd[1]: libpod-conmon-a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c.scope: Deactivated successfully.
Oct  7 16:06:25 np0005474864 python3.9[210843]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  7 16:06:25 np0005474864 systemd[1]: Started libpod-conmon-a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c.scope.
Oct  7 16:06:25 np0005474864 podman[210844]: 2025-10-07 20:06:25.661817953 +0000 UTC m=+0.109507763 container exec a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2)
Oct  7 16:06:25 np0005474864 podman[210844]: 2025-10-07 20:06:25.69982072 +0000 UTC m=+0.147510580 container exec_died a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  7 16:06:25 np0005474864 podman[210860]: 2025-10-07 20:06:25.771304884 +0000 UTC m=+0.100232637 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:06:25 np0005474864 systemd[1]: libpod-conmon-a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c.scope: Deactivated successfully.
Oct  7 16:06:26 np0005474864 python3.9[211049]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:06:27 np0005474864 python3.9[211201]: ansible-containers.podman.podman_container_info Invoked with name=['iscsid'] executable=podman
Oct  7 16:06:28 np0005474864 python3.9[211366]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  7 16:06:28 np0005474864 systemd[1]: Started libpod-conmon-6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b.scope.
Oct  7 16:06:28 np0005474864 podman[211367]: 2025-10-07 20:06:28.631821485 +0000 UTC m=+0.111611524 container exec 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  7 16:06:28 np0005474864 podman[211367]: 2025-10-07 20:06:28.670051634 +0000 UTC m=+0.149841633 container exec_died 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  7 16:06:28 np0005474864 systemd[1]: libpod-conmon-6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b.scope: Deactivated successfully.
Oct  7 16:06:28 np0005474864 podman[211383]: 2025-10-07 20:06:28.734096064 +0000 UTC m=+0.098194389 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  7 16:06:29 np0005474864 python3.9[211569]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=iscsid detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  7 16:06:29 np0005474864 systemd[1]: Started libpod-conmon-6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b.scope.
Oct  7 16:06:29 np0005474864 podman[211570]: 2025-10-07 20:06:29.60230125 +0000 UTC m=+0.100959715 container exec 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:06:29 np0005474864 podman[211570]: 2025-10-07 20:06:29.638676688 +0000 UTC m=+0.137335153 container exec_died 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:06:29 np0005474864 systemd[1]: libpod-conmon-6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b.scope: Deactivated successfully.
Oct  7 16:06:30 np0005474864 python3.9[211751]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/iscsid recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.250 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:06:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:06:31 np0005474864 python3.9[211903]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Oct  7 16:06:32 np0005474864 python3.9[212069]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  7 16:06:32 np0005474864 systemd[1]: Started libpod-conmon-ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434.scope.
Oct  7 16:06:32 np0005474864 podman[212070]: 2025-10-07 20:06:32.539536714 +0000 UTC m=+0.100086061 container exec ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:06:32 np0005474864 podman[212070]: 2025-10-07 20:06:32.572025667 +0000 UTC m=+0.132574984 container exec_died ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd)
Oct  7 16:06:32 np0005474864 systemd[1]: libpod-conmon-ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434.scope: Deactivated successfully.
Oct  7 16:06:32 np0005474864 podman[212087]: 2025-10-07 20:06:32.63172609 +0000 UTC m=+0.076236563 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=5, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  7 16:06:32 np0005474864 systemd[1]: b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a-24852d920216a452.service: Main process exited, code=exited, status=1/FAILURE
Oct  7 16:06:32 np0005474864 systemd[1]: b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a-24852d920216a452.service: Failed with result 'exit-code'.
Oct  7 16:06:33 np0005474864 python3.9[212268]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  7 16:06:33 np0005474864 systemd[1]: Started libpod-conmon-ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434.scope.
Oct  7 16:06:33 np0005474864 podman[212269]: 2025-10-07 20:06:33.527939926 +0000 UTC m=+0.084995940 container exec ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:06:33 np0005474864 podman[212269]: 2025-10-07 20:06:33.563855863 +0000 UTC m=+0.120911877 container exec_died ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3)
Oct  7 16:06:33 np0005474864 systemd[1]: libpod-conmon-ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434.scope: Deactivated successfully.
Oct  7 16:06:34 np0005474864 python3.9[212453]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:06:35 np0005474864 python3.9[212605]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Oct  7 16:06:36 np0005474864 systemd[1]: packagekit.service: Deactivated successfully.
Oct  7 16:06:36 np0005474864 python3.9[212770]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  7 16:06:36 np0005474864 systemd[1]: Started libpod-conmon-b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a.scope.
Oct  7 16:06:36 np0005474864 podman[212771]: 2025-10-07 20:06:36.321865936 +0000 UTC m=+0.094133169 container exec b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:06:36 np0005474864 podman[212771]: 2025-10-07 20:06:36.332563247 +0000 UTC m=+0.104830470 container exec_died b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  7 16:06:36 np0005474864 systemd[1]: libpod-conmon-b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a.scope: Deactivated successfully.
Oct  7 16:06:37 np0005474864 python3.9[212954]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  7 16:06:37 np0005474864 systemd[1]: Started libpod-conmon-b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a.scope.
Oct  7 16:06:37 np0005474864 podman[212955]: 2025-10-07 20:06:37.194911933 +0000 UTC m=+0.095682661 container exec b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  7 16:06:37 np0005474864 podman[212955]: 2025-10-07 20:06:37.225065533 +0000 UTC m=+0.125836201 container exec_died b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm)
Oct  7 16:06:37 np0005474864 systemd[1]: libpod-conmon-b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a.scope: Deactivated successfully.
Oct  7 16:06:38 np0005474864 python3.9[213137]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:06:38 np0005474864 python3.9[213289]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Oct  7 16:06:39 np0005474864 python3.9[213455]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  7 16:06:39 np0005474864 systemd[1]: Started libpod-conmon-52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815.scope.
Oct  7 16:06:40 np0005474864 podman[213456]: 2025-10-07 20:06:40.006455342 +0000 UTC m=+0.115096509 container exec 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:06:40 np0005474864 podman[213456]: 2025-10-07 20:06:40.042964694 +0000 UTC m=+0.151605861 container exec_died 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  7 16:06:40 np0005474864 systemd[1]: libpod-conmon-52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815.scope: Deactivated successfully.
Oct  7 16:06:40 np0005474864 python3.9[213638]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  7 16:06:41 np0005474864 systemd[1]: Started libpod-conmon-52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815.scope.
Oct  7 16:06:41 np0005474864 podman[213639]: 2025-10-07 20:06:41.085695173 +0000 UTC m=+0.086900693 container exec 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:06:41 np0005474864 podman[213639]: 2025-10-07 20:06:41.122835562 +0000 UTC m=+0.124041082 container exec_died 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:06:41 np0005474864 systemd[1]: libpod-conmon-52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815.scope: Deactivated successfully.
Oct  7 16:06:42 np0005474864 python3.9[213823]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:06:42 np0005474864 python3.9[213975]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Oct  7 16:06:43 np0005474864 python3.9[214140]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  7 16:06:44 np0005474864 systemd[1]: Started libpod-conmon-b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193.scope.
Oct  7 16:06:44 np0005474864 podman[214141]: 2025-10-07 20:06:44.05358125 +0000 UTC m=+0.094606742 container exec b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:06:44 np0005474864 podman[214141]: 2025-10-07 20:06:44.087753029 +0000 UTC m=+0.128778481 container exec_died b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:06:44 np0005474864 systemd[1]: libpod-conmon-b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193.scope: Deactivated successfully.
Oct  7 16:06:44 np0005474864 python3.9[214325]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  7 16:06:45 np0005474864 systemd[1]: Started libpod-conmon-b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193.scope.
Oct  7 16:06:45 np0005474864 podman[214326]: 2025-10-07 20:06:45.090177511 +0000 UTC m=+0.091271431 container exec b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  7 16:06:45 np0005474864 podman[214326]: 2025-10-07 20:06:45.125998655 +0000 UTC m=+0.127092525 container exec_died b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  7 16:06:45 np0005474864 systemd[1]: libpod-conmon-b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193.scope: Deactivated successfully.
Oct  7 16:06:45 np0005474864 podman[214478]: 2025-10-07 20:06:45.820849979 +0000 UTC m=+0.091882158 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:06:45 np0005474864 podman[214479]: 2025-10-07 20:06:45.831872498 +0000 UTC m=+0.090499360 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, config_id=edpm, maintainer=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  7 16:06:46 np0005474864 python3.9[214548]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:06:46 np0005474864 python3.9[214704]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Oct  7 16:06:47 np0005474864 python3.9[214870]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  7 16:06:48 np0005474864 systemd[1]: Started libpod-conmon-cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85.scope.
Oct  7 16:06:48 np0005474864 podman[214871]: 2025-10-07 20:06:48.055707216 +0000 UTC m=+0.106843035 container exec cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, io.openshift.expose-services=, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, build-date=2025-08-20T13:12:41, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, version=9.6, architecture=x86_64, distribution-scope=public)
Oct  7 16:06:48 np0005474864 podman[214871]: 2025-10-07 20:06:48.091823947 +0000 UTC m=+0.142959756 container exec_died cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_id=edpm, io.buildah.version=1.33.7, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  7 16:06:48 np0005474864 systemd[1]: libpod-conmon-cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85.scope: Deactivated successfully.
Oct  7 16:06:48 np0005474864 python3.9[215054]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Oct  7 16:06:49 np0005474864 systemd[1]: Started libpod-conmon-cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85.scope.
Oct  7 16:06:49 np0005474864 podman[215055]: 2025-10-07 20:06:49.046091301 +0000 UTC m=+0.079203994 container exec cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc.)
Oct  7 16:06:49 np0005474864 podman[215055]: 2025-10-07 20:06:49.077084983 +0000 UTC m=+0.110197726 container exec_died cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, release=1755695350, config_id=edpm, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6)
Oct  7 16:06:49 np0005474864 systemd[1]: libpod-conmon-cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85.scope: Deactivated successfully.
Oct  7 16:06:49 np0005474864 python3.9[215239]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:06:50 np0005474864 python3.9[215391]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:06:51 np0005474864 podman[215517]: 2025-10-07 20:06:51.42931219 +0000 UTC m=+0.106468215 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  7 16:06:51 np0005474864 podman[215508]: 2025-10-07 20:06:51.454046542 +0000 UTC m=+0.133364996 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  7 16:06:51 np0005474864 podman[215515]: 2025-10-07 20:06:51.456606281 +0000 UTC m=+0.135597906 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller)
Oct  7 16:06:51 np0005474864 python3.9[215601]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:06:52 np0005474864 python3.9[215730]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1759867611.0661385-3308-94259071555087/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:06:53 np0005474864 python3.9[215882]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:06:54 np0005474864 python3.9[216034]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:06:54 np0005474864 python3.9[216112]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:06:55 np0005474864 python3.9[216264]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:06:56 np0005474864 podman[216314]: 2025-10-07 20:06:56.124639803 +0000 UTC m=+0.093351128 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  7 16:06:56 np0005474864 python3.9[216359]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.vjouco9p recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:06:57 np0005474864 python3.9[216513]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:06:57 np0005474864 python3.9[216591]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:06:58 np0005474864 python3.9[216743]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 16:06:59 np0005474864 podman[216821]: 2025-10-07 20:06:59.40489695 +0000 UTC m=+0.089121383 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  7 16:06:59 np0005474864 python3[216922]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Oct  7 16:07:00 np0005474864 python3.9[217074]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:07:02 np0005474864 python3.9[217152]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:07:03 np0005474864 podman[217276]: 2025-10-07 20:07:03.199724291 +0000 UTC m=+0.123177128 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:07:03 np0005474864 python3.9[217321]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:07:03 np0005474864 python3.9[217403]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:07:04 np0005474864 python3.9[217555]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:07:05 np0005474864 python3.9[217633]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:07:06 np0005474864 python3.9[217785]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:07:07 np0005474864 python3.9[217863]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:07:07 np0005474864 python3.9[218015]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True
Oct  7 16:07:08 np0005474864 python3.9[218140]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1759867627.332998-3683-44919651653184/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:07:09 np0005474864 python3.9[218292]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:07:10 np0005474864 python3.9[218444]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 16:07:11 np0005474864 python3.9[218599]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:07:12 np0005474864 nova_compute[192593]: 2025-10-07 20:07:12.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:07:12 np0005474864 python3.9[218751]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 16:07:13 np0005474864 python3.9[218904]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Oct  7 16:07:14 np0005474864 nova_compute[192593]: 2025-10-07 20:07:14.088 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:07:14 np0005474864 nova_compute[192593]: 2025-10-07 20:07:14.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:07:14 np0005474864 python3.9[219058]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Oct  7 16:07:15 np0005474864 nova_compute[192593]: 2025-10-07 20:07:15.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:07:15 np0005474864 nova_compute[192593]: 2025-10-07 20:07:15.094 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:07:15 np0005474864 nova_compute[192593]: 2025-10-07 20:07:15.145 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:07:15 np0005474864 nova_compute[192593]: 2025-10-07 20:07:15.146 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:07:15 np0005474864 nova_compute[192593]: 2025-10-07 20:07:15.146 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:07:15 np0005474864 nova_compute[192593]: 2025-10-07 20:07:15.146 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:07:15 np0005474864 nova_compute[192593]: 2025-10-07 20:07:15.345 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:07:15 np0005474864 nova_compute[192593]: 2025-10-07 20:07:15.345 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5957MB free_disk=73.50153732299805GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:07:15 np0005474864 nova_compute[192593]: 2025-10-07 20:07:15.346 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:07:15 np0005474864 nova_compute[192593]: 2025-10-07 20:07:15.346 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:07:15 np0005474864 nova_compute[192593]: 2025-10-07 20:07:15.415 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:07:15 np0005474864 nova_compute[192593]: 2025-10-07 20:07:15.416 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:07:15 np0005474864 nova_compute[192593]: 2025-10-07 20:07:15.445 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:07:15 np0005474864 nova_compute[192593]: 2025-10-07 20:07:15.463 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:07:15 np0005474864 nova_compute[192593]: 2025-10-07 20:07:15.466 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:07:15 np0005474864 nova_compute[192593]: 2025-10-07 20:07:15.467 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:07:15 np0005474864 python3.9[219213]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Oct  7 16:07:16 np0005474864 systemd[1]: session-29.scope: Deactivated successfully.
Oct  7 16:07:16 np0005474864 systemd[1]: session-29.scope: Consumed 2min 4.359s CPU time.
Oct  7 16:07:16 np0005474864 systemd-logind[805]: Session 29 logged out. Waiting for processes to exit.
Oct  7 16:07:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:07:16.176 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:07:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:07:16.177 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:07:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:07:16.177 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:07:16 np0005474864 systemd-logind[805]: Removed session 29.
Oct  7 16:07:16 np0005474864 podman[219238]: 2025-10-07 20:07:16.297614722 +0000 UTC m=+0.093211484 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:07:16 np0005474864 podman[219239]: 2025-10-07 20:07:16.315481748 +0000 UTC m=+0.098366965 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, release=1755695350, io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., name=ubi9-minimal, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  7 16:07:16 np0005474864 nova_compute[192593]: 2025-10-07 20:07:16.466 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:07:16 np0005474864 nova_compute[192593]: 2025-10-07 20:07:16.466 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:07:16 np0005474864 nova_compute[192593]: 2025-10-07 20:07:16.467 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:07:17 np0005474864 nova_compute[192593]: 2025-10-07 20:07:17.089 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:07:17 np0005474864 nova_compute[192593]: 2025-10-07 20:07:17.112 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:07:17 np0005474864 nova_compute[192593]: 2025-10-07 20:07:17.112 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:07:17 np0005474864 nova_compute[192593]: 2025-10-07 20:07:17.112 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:07:17 np0005474864 nova_compute[192593]: 2025-10-07 20:07:17.140 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:07:18 np0005474864 nova_compute[192593]: 2025-10-07 20:07:18.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:07:22 np0005474864 podman[219282]: 2025-10-07 20:07:22.383833326 +0000 UTC m=+0.080491968 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:07:22 np0005474864 podman[219284]: 2025-10-07 20:07:22.418501339 +0000 UTC m=+0.097512872 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:07:22 np0005474864 podman[219283]: 2025-10-07 20:07:22.44835481 +0000 UTC m=+0.141377943 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  7 16:07:26 np0005474864 podman[219346]: 2025-10-07 20:07:26.388806477 +0000 UTC m=+0.082046701 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:07:30 np0005474864 podman[219367]: 2025-10-07 20:07:30.387804788 +0000 UTC m=+0.082835822 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  7 16:07:33 np0005474864 podman[219391]: 2025-10-07 20:07:33.430174959 +0000 UTC m=+0.125752640 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  7 16:07:47 np0005474864 podman[219412]: 2025-10-07 20:07:47.364121109 +0000 UTC m=+0.052670081 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, distribution-scope=public, name=ubi9-minimal, config_id=edpm, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, vcs-type=git)
Oct  7 16:07:47 np0005474864 podman[219411]: 2025-10-07 20:07:47.370312024 +0000 UTC m=+0.064850456 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  7 16:07:53 np0005474864 podman[219455]: 2025-10-07 20:07:53.421891315 +0000 UTC m=+0.102943415 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:07:53 np0005474864 podman[219457]: 2025-10-07 20:07:53.448007944 +0000 UTC m=+0.115289094 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  7 16:07:53 np0005474864 podman[219456]: 2025-10-07 20:07:53.475721068 +0000 UTC m=+0.149645646 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:07:57 np0005474864 podman[219520]: 2025-10-07 20:07:57.359662082 +0000 UTC m=+0.063046075 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  7 16:08:00 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:08:00.079 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:76:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'ba:bb:91:e8:2b:5d'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:08:00 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:08:00.081 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  7 16:08:00 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:08:00.082 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:08:01 np0005474864 podman[219539]: 2025-10-07 20:08:01.372672389 +0000 UTC m=+0.067726047 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  7 16:08:04 np0005474864 podman[219563]: 2025-10-07 20:08:04.372608224 +0000 UTC m=+0.070715423 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:08:12 np0005474864 nova_compute[192593]: 2025-10-07 20:08:12.095 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:08:12 np0005474864 nova_compute[192593]: 2025-10-07 20:08:12.098 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:08:12 np0005474864 nova_compute[192593]: 2025-10-07 20:08:12.099 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  7 16:08:12 np0005474864 nova_compute[192593]: 2025-10-07 20:08:12.151 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  7 16:08:12 np0005474864 nova_compute[192593]: 2025-10-07 20:08:12.153 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:08:12 np0005474864 nova_compute[192593]: 2025-10-07 20:08:12.154 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  7 16:08:12 np0005474864 nova_compute[192593]: 2025-10-07 20:08:12.176 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.192 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.194 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.194 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.251 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.252 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.253 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.253 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.497 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.498 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6050MB free_disk=73.5015754699707GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.499 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.499 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.677 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.678 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.735 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Refreshing inventories for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.757 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Updating ProviderTree inventory for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.758 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Updating inventory in ProviderTree for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.810 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Refreshing aggregate associations for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.828 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Refreshing trait associations for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.855 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.872 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.875 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:08:15 np0005474864 nova_compute[192593]: 2025-10-07 20:08:15.875 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:08:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:08:16.178 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:08:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:08:16.179 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:08:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:08:16.179 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:08:16 np0005474864 nova_compute[192593]: 2025-10-07 20:08:16.774 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:08:17 np0005474864 nova_compute[192593]: 2025-10-07 20:08:17.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:08:18 np0005474864 nova_compute[192593]: 2025-10-07 20:08:18.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:08:18 np0005474864 nova_compute[192593]: 2025-10-07 20:08:18.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:08:18 np0005474864 nova_compute[192593]: 2025-10-07 20:08:18.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:08:18 np0005474864 nova_compute[192593]: 2025-10-07 20:08:18.123 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:08:18 np0005474864 nova_compute[192593]: 2025-10-07 20:08:18.124 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:08:18 np0005474864 nova_compute[192593]: 2025-10-07 20:08:18.124 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:08:18 np0005474864 nova_compute[192593]: 2025-10-07 20:08:18.125 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:08:18 np0005474864 podman[219585]: 2025-10-07 20:08:18.397879757 +0000 UTC m=+0.097101929 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  7 16:08:18 np0005474864 podman[219586]: 2025-10-07 20:08:18.40361387 +0000 UTC m=+0.088776144 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, vcs-type=git, managed_by=edpm_ansible, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, version=9.6, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.openshift.tags=minimal rhel9)
Oct  7 16:08:24 np0005474864 podman[219630]: 2025-10-07 20:08:24.398970712 +0000 UTC m=+0.079651116 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  7 16:08:24 np0005474864 podman[219632]: 2025-10-07 20:08:24.420852211 +0000 UTC m=+0.097500851 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  7 16:08:24 np0005474864 podman[219631]: 2025-10-07 20:08:24.480683184 +0000 UTC m=+0.157614032 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:08:28 np0005474864 podman[219694]: 2025-10-07 20:08:28.361031107 +0000 UTC m=+0.063222090 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.251 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.252 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:08:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:08:32 np0005474864 podman[219714]: 2025-10-07 20:08:32.339169636 +0000 UTC m=+0.041974588 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:08:35 np0005474864 podman[219738]: 2025-10-07 20:08:35.409647738 +0000 UTC m=+0.090023789 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  7 16:08:49 np0005474864 podman[219761]: 2025-10-07 20:08:49.386761274 +0000 UTC m=+0.064261437 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, version=9.6, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, release=1755695350, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Oct  7 16:08:49 np0005474864 podman[219760]: 2025-10-07 20:08:49.411224356 +0000 UTC m=+0.097910473 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:08:55 np0005474864 podman[219804]: 2025-10-07 20:08:55.376917813 +0000 UTC m=+0.068511380 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:08:55 np0005474864 podman[219806]: 2025-10-07 20:08:55.389831173 +0000 UTC m=+0.069897838 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:08:55 np0005474864 podman[219805]: 2025-10-07 20:08:55.424404826 +0000 UTC m=+0.110150535 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  7 16:08:59 np0005474864 podman[219868]: 2025-10-07 20:08:59.377326616 +0000 UTC m=+0.071872105 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  7 16:09:03 np0005474864 podman[219888]: 2025-10-07 20:09:03.357320912 +0000 UTC m=+0.052797048 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  7 16:09:06 np0005474864 podman[219913]: 2025-10-07 20:09:06.396655553 +0000 UTC m=+0.080584276 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0)
Oct  7 16:09:14 np0005474864 nova_compute[192593]: 2025-10-07 20:09:14.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:09:15 np0005474864 nova_compute[192593]: 2025-10-07 20:09:15.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:09:15 np0005474864 nova_compute[192593]: 2025-10-07 20:09:15.138 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:09:15 np0005474864 nova_compute[192593]: 2025-10-07 20:09:15.139 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:09:15 np0005474864 nova_compute[192593]: 2025-10-07 20:09:15.139 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:09:15 np0005474864 nova_compute[192593]: 2025-10-07 20:09:15.140 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:09:15 np0005474864 nova_compute[192593]: 2025-10-07 20:09:15.342 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:09:15 np0005474864 nova_compute[192593]: 2025-10-07 20:09:15.343 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6076MB free_disk=73.50212478637695GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:09:15 np0005474864 nova_compute[192593]: 2025-10-07 20:09:15.344 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:09:15 np0005474864 nova_compute[192593]: 2025-10-07 20:09:15.344 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:09:15 np0005474864 nova_compute[192593]: 2025-10-07 20:09:15.499 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:09:15 np0005474864 nova_compute[192593]: 2025-10-07 20:09:15.499 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:09:15 np0005474864 nova_compute[192593]: 2025-10-07 20:09:15.525 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:09:15 np0005474864 nova_compute[192593]: 2025-10-07 20:09:15.540 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:09:15 np0005474864 nova_compute[192593]: 2025-10-07 20:09:15.542 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:09:15 np0005474864 nova_compute[192593]: 2025-10-07 20:09:15.542 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.198s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:09:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:09:16.180 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:09:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:09:16.180 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:09:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:09:16.180 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:09:16 np0005474864 nova_compute[192593]: 2025-10-07 20:09:16.542 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:09:17 np0005474864 nova_compute[192593]: 2025-10-07 20:09:17.088 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:09:17 np0005474864 nova_compute[192593]: 2025-10-07 20:09:17.091 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:09:19 np0005474864 nova_compute[192593]: 2025-10-07 20:09:19.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:09:19 np0005474864 nova_compute[192593]: 2025-10-07 20:09:19.092 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:09:19 np0005474864 nova_compute[192593]: 2025-10-07 20:09:19.092 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:09:19 np0005474864 nova_compute[192593]: 2025-10-07 20:09:19.229 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:09:19 np0005474864 nova_compute[192593]: 2025-10-07 20:09:19.229 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:09:20 np0005474864 nova_compute[192593]: 2025-10-07 20:09:20.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:09:20 np0005474864 nova_compute[192593]: 2025-10-07 20:09:20.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:09:20 np0005474864 nova_compute[192593]: 2025-10-07 20:09:20.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:09:20 np0005474864 podman[219933]: 2025-10-07 20:09:20.392856157 +0000 UTC m=+0.082103200 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  7 16:09:20 np0005474864 podman[219934]: 2025-10-07 20:09:20.404369268 +0000 UTC m=+0.086820285 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, vcs-type=git, com.redhat.component=ubi9-minimal-container, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9)
Oct  7 16:09:21 np0005474864 nova_compute[192593]: 2025-10-07 20:09:21.089 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:09:26 np0005474864 podman[219976]: 2025-10-07 20:09:26.436678648 +0000 UTC m=+0.126568626 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  7 16:09:26 np0005474864 podman[219978]: 2025-10-07 20:09:26.452638517 +0000 UTC m=+0.126231327 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd)
Oct  7 16:09:26 np0005474864 podman[219977]: 2025-10-07 20:09:26.488792315 +0000 UTC m=+0.169643854 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:09:30 np0005474864 podman[220041]: 2025-10-07 20:09:30.410120387 +0000 UTC m=+0.093390304 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  7 16:09:34 np0005474864 podman[220063]: 2025-10-07 20:09:34.391732119 +0000 UTC m=+0.080848584 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:09:37 np0005474864 podman[220088]: 2025-10-07 20:09:37.40281142 +0000 UTC m=+0.093878127 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  7 16:09:39 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:09:39.892 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:76:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'ba:bb:91:e8:2b:5d'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:09:39 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:09:39.893 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  7 16:09:43 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:09:43.895 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:09:51 np0005474864 podman[220109]: 2025-10-07 20:09:51.410054073 +0000 UTC m=+0.100363894 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:09:51 np0005474864 podman[220110]: 2025-10-07 20:09:51.410717512 +0000 UTC m=+0.093078885 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, container_name=openstack_network_exporter, config_id=edpm, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, vcs-type=git, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  7 16:09:57 np0005474864 podman[220155]: 2025-10-07 20:09:57.391328078 +0000 UTC m=+0.074340916 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  7 16:09:57 np0005474864 podman[220157]: 2025-10-07 20:09:57.402296993 +0000 UTC m=+0.081961945 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd)
Oct  7 16:09:57 np0005474864 podman[220156]: 2025-10-07 20:09:57.460540866 +0000 UTC m=+0.140209558 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:10:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:01.303 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3c:bb:b5 10.100.0.2 2001:db8::f816:3eff:fe3c:bbb5'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe3c:bbb5/64', 'neutron:device_id': 'ovnmeta-66c1e9d1-f7c9-4177-8879-f2fdc9afb323', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66c1e9d1-f7c9-4177-8879-f2fdc9afb323', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=12cd89e6-df45-491a-aa2e-876dbaa8c4ca, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=b09f7fdc-1192-4e98-9ac4-3f20a8438833) old=Port_Binding(mac=['fa:16:3e:3c:bb:b5 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-66c1e9d1-f7c9-4177-8879-f2fdc9afb323', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-66c1e9d1-f7c9-4177-8879-f2fdc9afb323', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:10:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:01.306 103685 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port b09f7fdc-1192-4e98-9ac4-3f20a8438833 in datapath 66c1e9d1-f7c9-4177-8879-f2fdc9afb323 updated#033[00m
Oct  7 16:10:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:01.310 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 66c1e9d1-f7c9-4177-8879-f2fdc9afb323, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:10:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:01.312 103685 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpwa8twj7a/privsep.sock']#033[00m
Oct  7 16:10:01 np0005474864 podman[220220]: 2025-10-07 20:10:01.400590565 +0000 UTC m=+0.086745053 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  7 16:10:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:02.050 103685 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  7 16:10:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:02.052 103685 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpwa8twj7a/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  7 16:10:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:01.897 220243 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  7 16:10:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:01.905 220243 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  7 16:10:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:01.908 220243 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Oct  7 16:10:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:01.908 220243 INFO oslo.privsep.daemon [-] privsep daemon running as pid 220243#033[00m
Oct  7 16:10:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:02.056 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[fa8b0b70-9a5c-419d-beb3-cd84b5beb0ff]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:02.590 220243 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:02.590 220243 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:02.590 220243 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:02.699 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[1f862241-ffae-4254-bb83-01d2421e0948]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:04 np0005474864 nova_compute[192593]: 2025-10-07 20:10:04.902 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:04 np0005474864 nova_compute[192593]: 2025-10-07 20:10:04.903 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:04 np0005474864 nova_compute[192593]: 2025-10-07 20:10:04.928 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:04 np0005474864 nova_compute[192593]: 2025-10-07 20:10:04.929 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:04 np0005474864 nova_compute[192593]: 2025-10-07 20:10:04.931 2 DEBUG nova.compute.manager [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  7 16:10:04 np0005474864 nova_compute[192593]: 2025-10-07 20:10:04.973 2 DEBUG nova.compute.manager [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.047 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.048 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.063 2 DEBUG nova.virt.hardware [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.064 2 INFO nova.compute.claims [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.072 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.218 2 DEBUG nova.compute.provider_tree [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.235 2 DEBUG nova.scheduler.client.report [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.259 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.261 2 DEBUG nova.compute.manager [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.265 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.193s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.274 2 DEBUG nova.virt.hardware [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.275 2 INFO nova.compute.claims [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.322 2 DEBUG nova.compute.manager [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.323 2 DEBUG nova.network.neutron [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.349 2 INFO nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.369 2 DEBUG nova.compute.manager [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  7 16:10:05 np0005474864 podman[220248]: 2025-10-07 20:10:05.371738157 +0000 UTC m=+0.066330986 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.417 2 DEBUG nova.compute.provider_tree [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.449 2 DEBUG nova.scheduler.client.report [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.465 2 DEBUG nova.compute.manager [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.467 2 DEBUG nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.468 2 INFO nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Creating image(s)#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.469 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "/var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.469 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "/var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.471 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "/var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.471 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.472 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.476 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.211s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.477 2 DEBUG nova.compute.manager [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.527 2 DEBUG nova.compute.manager [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.528 2 DEBUG nova.network.neutron [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.542 2 INFO nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.559 2 DEBUG nova.compute.manager [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.652 2 DEBUG nova.compute.manager [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.654 2 DEBUG nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.655 2 INFO nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Creating image(s)#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.656 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "/var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.657 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "/var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.658 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "/var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:05 np0005474864 nova_compute[192593]: 2025-10-07 20:10:05.659 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:06 np0005474864 nova_compute[192593]: 2025-10-07 20:10:06.102 2 WARNING oslo_policy.policy [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Oct  7 16:10:06 np0005474864 nova_compute[192593]: 2025-10-07 20:10:06.103 2 WARNING oslo_policy.policy [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Oct  7 16:10:06 np0005474864 nova_compute[192593]: 2025-10-07 20:10:06.105 2 DEBUG nova.policy [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  7 16:10:06 np0005474864 nova_compute[192593]: 2025-10-07 20:10:06.389 2 DEBUG nova.policy [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  7 16:10:07 np0005474864 nova_compute[192593]: 2025-10-07 20:10:07.496 2 DEBUG oslo_concurrency.processutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:10:07 np0005474864 nova_compute[192593]: 2025-10-07 20:10:07.580 2 DEBUG oslo_concurrency.processutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b.part --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:10:07 np0005474864 nova_compute[192593]: 2025-10-07 20:10:07.582 2 DEBUG nova.virt.images [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] 3c70ce5f-6f9a-4def-9c79-e5a33d631679 was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Oct  7 16:10:07 np0005474864 nova_compute[192593]: 2025-10-07 20:10:07.584 2 DEBUG nova.privsep.utils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  7 16:10:07 np0005474864 nova_compute[192593]: 2025-10-07 20:10:07.584 2 DEBUG oslo_concurrency.processutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b.part /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:10:07 np0005474864 nova_compute[192593]: 2025-10-07 20:10:07.780 2 DEBUG oslo_concurrency.processutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b.part /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b.converted" returned: 0 in 0.196s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:10:07 np0005474864 nova_compute[192593]: 2025-10-07 20:10:07.790 2 DEBUG oslo_concurrency.processutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:10:07 np0005474864 nova_compute[192593]: 2025-10-07 20:10:07.865 2 DEBUG nova.network.neutron [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Successfully created port: 9283f59d-4eb5-4e2a-876d-b078582f6dec _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  7 16:10:07 np0005474864 nova_compute[192593]: 2025-10-07 20:10:07.874 2 DEBUG oslo_concurrency.processutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b.converted --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:10:07 np0005474864 nova_compute[192593]: 2025-10-07 20:10:07.875 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.403s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:07 np0005474864 nova_compute[192593]: 2025-10-07 20:10:07.900 2 INFO oslo.privsep.daemon [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmptgfnzlc8/privsep.sock']#033[00m
Oct  7 16:10:07 np0005474864 nova_compute[192593]: 2025-10-07 20:10:07.901 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 2.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:07 np0005474864 nova_compute[192593]: 2025-10-07 20:10:07.902 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:08 np0005474864 podman[220292]: 2025-10-07 20:10:08.396571741 +0000 UTC m=+0.087017670 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:10:08 np0005474864 nova_compute[192593]: 2025-10-07 20:10:08.662 2 INFO oslo.privsep.daemon [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Oct  7 16:10:08 np0005474864 nova_compute[192593]: 2025-10-07 20:10:08.515 55 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  7 16:10:08 np0005474864 nova_compute[192593]: 2025-10-07 20:10:08.523 55 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  7 16:10:08 np0005474864 nova_compute[192593]: 2025-10-07 20:10:08.527 55 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Oct  7 16:10:08 np0005474864 nova_compute[192593]: 2025-10-07 20:10:08.527 55 INFO oslo.privsep.daemon [-] privsep daemon running as pid 55#033[00m
Oct  7 16:10:08 np0005474864 nova_compute[192593]: 2025-10-07 20:10:08.666 2 WARNING oslo_privsep.priv_context [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] privsep daemon already running#033[00m
Oct  7 16:10:08 np0005474864 nova_compute[192593]: 2025-10-07 20:10:08.754 2 DEBUG oslo_concurrency.processutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:10:08 np0005474864 nova_compute[192593]: 2025-10-07 20:10:08.779 2 DEBUG oslo_concurrency.processutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:10:08 np0005474864 nova_compute[192593]: 2025-10-07 20:10:08.842 2 DEBUG oslo_concurrency.processutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:10:08 np0005474864 nova_compute[192593]: 2025-10-07 20:10:08.844 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:08 np0005474864 nova_compute[192593]: 2025-10-07 20:10:08.846 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:08 np0005474864 nova_compute[192593]: 2025-10-07 20:10:08.876 2 DEBUG oslo_concurrency.processutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:10:08 np0005474864 nova_compute[192593]: 2025-10-07 20:10:08.899 2 DEBUG oslo_concurrency.processutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.119s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:10:08 np0005474864 nova_compute[192593]: 2025-10-07 20:10:08.900 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:08 np0005474864 nova_compute[192593]: 2025-10-07 20:10:08.966 2 DEBUG oslo_concurrency.processutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:10:08 np0005474864 nova_compute[192593]: 2025-10-07 20:10:08.968 2 DEBUG oslo_concurrency.processutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.012 2 DEBUG oslo_concurrency.processutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.015 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.016 2 DEBUG oslo_concurrency.processutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.037 2 DEBUG nova.network.neutron [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Successfully created port: 6ac9cc3d-5039-41fa-a966-ec61d9e9c38b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.041 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.055 2 DEBUG oslo_concurrency.processutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.082 2 DEBUG oslo_concurrency.processutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.083 2 DEBUG nova.virt.disk.api [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Checking if we can resize image /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.083 2 DEBUG oslo_concurrency.processutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.133 2 DEBUG oslo_concurrency.processutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.134 2 DEBUG oslo_concurrency.processutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.153 2 DEBUG oslo_concurrency.processutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.154 2 DEBUG nova.virt.disk.api [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Cannot resize image /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.155 2 DEBUG nova.objects.instance [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lazy-loading 'migration_context' on Instance uuid 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.171 2 DEBUG nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.172 2 DEBUG nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Ensure instance console log exists: /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.172 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.173 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.173 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.174 2 DEBUG oslo_concurrency.processutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.175 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.134s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.175 2 DEBUG oslo_concurrency.processutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.242 2 DEBUG oslo_concurrency.processutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.244 2 DEBUG nova.virt.disk.api [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Checking if we can resize image /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.245 2 DEBUG oslo_concurrency.processutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.335 2 DEBUG oslo_concurrency.processutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk --force-share --output=json" returned: 0 in 0.091s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.337 2 DEBUG nova.virt.disk.api [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Cannot resize image /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.338 2 DEBUG nova.objects.instance [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lazy-loading 'migration_context' on Instance uuid 31cd065b-2fe3-418f-869b-a5ac7f4405f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.369 2 DEBUG nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.370 2 DEBUG nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Ensure instance console log exists: /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.371 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.371 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:09 np0005474864 nova_compute[192593]: 2025-10-07 20:10:09.372 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:10 np0005474864 nova_compute[192593]: 2025-10-07 20:10:10.604 2 DEBUG nova.network.neutron [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Successfully updated port: 9283f59d-4eb5-4e2a-876d-b078582f6dec _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:10:10 np0005474864 nova_compute[192593]: 2025-10-07 20:10:10.619 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "refresh_cache-31cd065b-2fe3-418f-869b-a5ac7f4405f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:10:10 np0005474864 nova_compute[192593]: 2025-10-07 20:10:10.620 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquired lock "refresh_cache-31cd065b-2fe3-418f-869b-a5ac7f4405f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:10:10 np0005474864 nova_compute[192593]: 2025-10-07 20:10:10.620 2 DEBUG nova.network.neutron [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:10:11 np0005474864 nova_compute[192593]: 2025-10-07 20:10:11.048 2 DEBUG nova.network.neutron [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:10:11 np0005474864 nova_compute[192593]: 2025-10-07 20:10:11.118 2 DEBUG nova.compute.manager [req-5d0e19b7-fd87-4056-90ed-20a4772e9a59 req-0fe49385-61e8-4a91-8da3-e4cda845ac80 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Received event network-changed-9283f59d-4eb5-4e2a-876d-b078582f6dec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:10:11 np0005474864 nova_compute[192593]: 2025-10-07 20:10:11.119 2 DEBUG nova.compute.manager [req-5d0e19b7-fd87-4056-90ed-20a4772e9a59 req-0fe49385-61e8-4a91-8da3-e4cda845ac80 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Refreshing instance network info cache due to event network-changed-9283f59d-4eb5-4e2a-876d-b078582f6dec. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:10:11 np0005474864 nova_compute[192593]: 2025-10-07 20:10:11.119 2 DEBUG oslo_concurrency.lockutils [req-5d0e19b7-fd87-4056-90ed-20a4772e9a59 req-0fe49385-61e8-4a91-8da3-e4cda845ac80 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-31cd065b-2fe3-418f-869b-a5ac7f4405f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:10:12 np0005474864 nova_compute[192593]: 2025-10-07 20:10:12.062 2 DEBUG nova.network.neutron [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Successfully updated port: 6ac9cc3d-5039-41fa-a966-ec61d9e9c38b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:10:12 np0005474864 nova_compute[192593]: 2025-10-07 20:10:12.146 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "refresh_cache-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:10:12 np0005474864 nova_compute[192593]: 2025-10-07 20:10:12.146 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquired lock "refresh_cache-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:10:12 np0005474864 nova_compute[192593]: 2025-10-07 20:10:12.147 2 DEBUG nova.network.neutron [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:10:12 np0005474864 nova_compute[192593]: 2025-10-07 20:10:12.311 2 DEBUG nova.network.neutron [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.042 2 DEBUG nova.network.neutron [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Updating instance_info_cache with network_info: [{"id": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "address": "fa:16:3e:5d:0b:e9", "network": {"id": "a9053617-1148-4139-a949-8321e760481f", "bridge": "br-int", "label": "tempest-network-smoke--1784539601", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac9cc3d-50", "ovs_interfaceid": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.074 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Releasing lock "refresh_cache-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.074 2 DEBUG nova.compute.manager [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Instance network_info: |[{"id": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "address": "fa:16:3e:5d:0b:e9", "network": {"id": "a9053617-1148-4139-a949-8321e760481f", "bridge": "br-int", "label": "tempest-network-smoke--1784539601", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac9cc3d-50", "ovs_interfaceid": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.078 2 DEBUG nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Start _get_guest_xml network_info=[{"id": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "address": "fa:16:3e:5d:0b:e9", "network": {"id": "a9053617-1148-4139-a949-8321e760481f", "bridge": "br-int", "label": "tempest-network-smoke--1784539601", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac9cc3d-50", "ovs_interfaceid": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.085 2 WARNING nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.096 2 DEBUG nova.virt.libvirt.host [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.096 2 DEBUG nova.virt.libvirt.host [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.107 2 DEBUG nova.virt.libvirt.host [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.108 2 DEBUG nova.virt.libvirt.host [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.110 2 DEBUG nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.110 2 DEBUG nova.virt.hardware [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T20:09:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3fec056a-1226-48ad-a02c-e4fe097a9363',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.111 2 DEBUG nova.virt.hardware [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.111 2 DEBUG nova.virt.hardware [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.111 2 DEBUG nova.virt.hardware [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.112 2 DEBUG nova.virt.hardware [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.112 2 DEBUG nova.virt.hardware [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.112 2 DEBUG nova.virt.hardware [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.112 2 DEBUG nova.virt.hardware [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.113 2 DEBUG nova.virt.hardware [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.113 2 DEBUG nova.virt.hardware [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.113 2 DEBUG nova.virt.hardware [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.118 2 DEBUG nova.privsep.utils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.119 2 DEBUG nova.virt.libvirt.vif [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:10:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1247597446',display_name='tempest-TestNetworkBasicOps-server-1247597446',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1247597446',id=2,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEYMbYS6FykH8XwBy+xJhBj+DroVNsweQH8/yrZNy2DdnGZ8U7ITpQhcHiv47cPc+C9zUx61bpZMf9xi1jJuzMTLouiDNZyx+sOCB1Md+ZKzM9kBZzk7412n9ZJH/ZBT6Q==',key_name='tempest-TestNetworkBasicOps-1328202832',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-9t6cko4x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:10:05Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=3aa55e8a-0c2d-4f7b-aac0-c393e35ec679,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "address": "fa:16:3e:5d:0b:e9", "network": {"id": "a9053617-1148-4139-a949-8321e760481f", "bridge": "br-int", "label": "tempest-network-smoke--1784539601", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac9cc3d-50", "ovs_interfaceid": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.120 2 DEBUG nova.network.os_vif_util [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converting VIF {"id": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "address": "fa:16:3e:5d:0b:e9", "network": {"id": "a9053617-1148-4139-a949-8321e760481f", "bridge": "br-int", "label": "tempest-network-smoke--1784539601", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac9cc3d-50", "ovs_interfaceid": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.121 2 DEBUG nova.network.os_vif_util [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:0b:e9,bridge_name='br-int',has_traffic_filtering=True,id=6ac9cc3d-5039-41fa-a966-ec61d9e9c38b,network=Network(a9053617-1148-4139-a949-8321e760481f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ac9cc3d-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.122 2 DEBUG nova.objects.instance [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lazy-loading 'pci_devices' on Instance uuid 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.145 2 DEBUG nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] End _get_guest_xml xml=<domain type="kvm">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <uuid>3aa55e8a-0c2d-4f7b-aac0-c393e35ec679</uuid>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <name>instance-00000002</name>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <memory>131072</memory>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <vcpu>1</vcpu>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <nova:name>tempest-TestNetworkBasicOps-server-1247597446</nova:name>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <nova:creationTime>2025-10-07 20:10:13</nova:creationTime>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <nova:flavor name="m1.nano">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:        <nova:memory>128</nova:memory>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:        <nova:disk>1</nova:disk>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:        <nova:swap>0</nova:swap>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:        <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:        <nova:vcpus>1</nova:vcpus>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      </nova:flavor>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <nova:owner>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:        <nova:user uuid="fde8db13cdde4728903e9d2749f853e1">tempest-TestNetworkBasicOps-666319938-project-member</nova:user>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:        <nova:project uuid="57491b24c6b2419c842483a87c8b4d42">tempest-TestNetworkBasicOps-666319938</nova:project>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      </nova:owner>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <nova:ports>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:        <nova:port uuid="6ac9cc3d-5039-41fa-a966-ec61d9e9c38b">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      </nova:ports>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    </nova:instance>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <sysinfo type="smbios">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <entry name="manufacturer">RDO</entry>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <entry name="product">OpenStack Compute</entry>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <entry name="serial">3aa55e8a-0c2d-4f7b-aac0-c393e35ec679</entry>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <entry name="uuid">3aa55e8a-0c2d-4f7b-aac0-c393e35ec679</entry>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <entry name="family">Virtual Machine</entry>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <boot dev="hd"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <smbios mode="sysinfo"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <vmcoreinfo/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <clock offset="utc">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <timer name="pit" tickpolicy="delay"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <timer name="hpet" present="no"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <cpu mode="custom" match="exact">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <model>Nehalem</model>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <topology sockets="1" cores="1" threads="1"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <disk type="file" device="disk">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <target dev="vda" bus="virtio"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <disk type="file" device="cdrom">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <driver name="qemu" type="raw" cache="none"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.config"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <target dev="sda" bus="sata"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:5d:0b:e9"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <target dev="tap6ac9cc3d-50"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <serial type="pty">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <log file="/var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/console.log" append="off"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <input type="tablet" bus="usb"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <rng model="virtio">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <backend model="random">/dev/urandom</backend>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="usb" index="0"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <memballoon model="virtio">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <stats period="10"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:10:13 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:10:13 np0005474864 nova_compute[192593]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.147 2 DEBUG nova.compute.manager [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Preparing to wait for external event network-vif-plugged-6ac9cc3d-5039-41fa-a966-ec61d9e9c38b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.147 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.147 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.148 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.149 2 DEBUG nova.virt.libvirt.vif [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:10:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1247597446',display_name='tempest-TestNetworkBasicOps-server-1247597446',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1247597446',id=2,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEYMbYS6FykH8XwBy+xJhBj+DroVNsweQH8/yrZNy2DdnGZ8U7ITpQhcHiv47cPc+C9zUx61bpZMf9xi1jJuzMTLouiDNZyx+sOCB1Md+ZKzM9kBZzk7412n9ZJH/ZBT6Q==',key_name='tempest-TestNetworkBasicOps-1328202832',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-9t6cko4x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:10:05Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=3aa55e8a-0c2d-4f7b-aac0-c393e35ec679,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "address": "fa:16:3e:5d:0b:e9", "network": {"id": "a9053617-1148-4139-a949-8321e760481f", "bridge": "br-int", "label": "tempest-network-smoke--1784539601", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac9cc3d-50", "ovs_interfaceid": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.149 2 DEBUG nova.network.os_vif_util [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converting VIF {"id": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "address": "fa:16:3e:5d:0b:e9", "network": {"id": "a9053617-1148-4139-a949-8321e760481f", "bridge": "br-int", "label": "tempest-network-smoke--1784539601", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac9cc3d-50", "ovs_interfaceid": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.150 2 DEBUG nova.network.os_vif_util [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5d:0b:e9,bridge_name='br-int',has_traffic_filtering=True,id=6ac9cc3d-5039-41fa-a966-ec61d9e9c38b,network=Network(a9053617-1148-4139-a949-8321e760481f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ac9cc3d-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.150 2 DEBUG os_vif [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:0b:e9,bridge_name='br-int',has_traffic_filtering=True,id=6ac9cc3d-5039-41fa-a966-ec61d9e9c38b,network=Network(a9053617-1148-4139-a949-8321e760481f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ac9cc3d-50') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.191 2 DEBUG ovsdbapp.backend.ovs_idl [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.192 2 DEBUG ovsdbapp.backend.ovs_idl [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.192 2 DEBUG ovsdbapp.backend.ovs_idl [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [POLLOUT] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.195 2 DEBUG nova.network.neutron [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Updating instance_info_cache with network_info: [{"id": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "address": "fa:16:3e:6e:bc:c6", "network": {"id": "3c6f15ee-a1fe-4807-8291-599e41409640", "bridge": "br-int", "label": "tempest-network-smoke--1071304693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9283f59d-4e", "ovs_interfaceid": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.197 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.215 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Releasing lock "refresh_cache-31cd065b-2fe3-418f-869b-a5ac7f4405f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.215 2 DEBUG nova.compute.manager [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Instance network_info: |[{"id": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "address": "fa:16:3e:6e:bc:c6", "network": {"id": "3c6f15ee-a1fe-4807-8291-599e41409640", "bridge": "br-int", "label": "tempest-network-smoke--1071304693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9283f59d-4e", "ovs_interfaceid": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.216 2 DEBUG oslo_concurrency.lockutils [req-5d0e19b7-fd87-4056-90ed-20a4772e9a59 req-0fe49385-61e8-4a91-8da3-e4cda845ac80 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-31cd065b-2fe3-418f-869b-a5ac7f4405f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.216 2 DEBUG nova.network.neutron [req-5d0e19b7-fd87-4056-90ed-20a4772e9a59 req-0fe49385-61e8-4a91-8da3-e4cda845ac80 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Refreshing network info cache for port 9283f59d-4eb5-4e2a-876d-b078582f6dec _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.218 2 DEBUG nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Start _get_guest_xml network_info=[{"id": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "address": "fa:16:3e:6e:bc:c6", "network": {"id": "3c6f15ee-a1fe-4807-8291-599e41409640", "bridge": "br-int", "label": "tempest-network-smoke--1071304693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9283f59d-4e", "ovs_interfaceid": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.219 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.219 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.219 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.221 2 INFO oslo.privsep.daemon [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpie6dvxfs/privsep.sock']#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.247 2 WARNING nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.251 2 DEBUG nova.virt.libvirt.host [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.252 2 DEBUG nova.virt.libvirt.host [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.255 2 DEBUG nova.virt.libvirt.host [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.255 2 DEBUG nova.virt.libvirt.host [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.256 2 DEBUG nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.256 2 DEBUG nova.virt.hardware [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T20:09:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3fec056a-1226-48ad-a02c-e4fe097a9363',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.257 2 DEBUG nova.virt.hardware [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.257 2 DEBUG nova.virt.hardware [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.257 2 DEBUG nova.virt.hardware [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.257 2 DEBUG nova.virt.hardware [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.258 2 DEBUG nova.virt.hardware [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.258 2 DEBUG nova.virt.hardware [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.258 2 DEBUG nova.virt.hardware [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.258 2 DEBUG nova.virt.hardware [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.259 2 DEBUG nova.virt.hardware [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.259 2 DEBUG nova.virt.hardware [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.262 2 DEBUG nova.virt.libvirt.vif [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:10:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1391753478',display_name='tempest-TestNetworkAdvancedServerOps-server-1391753478',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1391753478',id=3,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHGp9K/m1XQlBJloQMlWOiYAkMHRg/+YyV7EIFeU64B1nJDtz1wGfsQsDxfqhOEvcl/IBS6gweH/4Fue49rFzrh66+jFDwTRyWcSgsUsGaMU3Uma/s2qqLF3+L5vxqg9xw==',key_name='tempest-TestNetworkAdvancedServerOps-1360006664',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8a545a398e2e433bbe3f3dfa2ec4ebcb',ramdisk_id='',reservation_id='r-6hld8jn4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-585003851',owner_user_name='tempest-TestNetworkAdvancedServerOps-585003851-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:10:05Z,user_data=None,user_id='db22b0e0f6594362af24484ba9b01936',uuid=31cd065b-2fe3-418f-869b-a5ac7f4405f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "address": "fa:16:3e:6e:bc:c6", "network": {"id": "3c6f15ee-a1fe-4807-8291-599e41409640", "bridge": "br-int", "label": "tempest-network-smoke--1071304693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9283f59d-4e", "ovs_interfaceid": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.263 2 DEBUG nova.network.os_vif_util [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converting VIF {"id": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "address": "fa:16:3e:6e:bc:c6", "network": {"id": "3c6f15ee-a1fe-4807-8291-599e41409640", "bridge": "br-int", "label": "tempest-network-smoke--1071304693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9283f59d-4e", "ovs_interfaceid": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.263 2 DEBUG nova.network.os_vif_util [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6e:bc:c6,bridge_name='br-int',has_traffic_filtering=True,id=9283f59d-4eb5-4e2a-876d-b078582f6dec,network=Network(3c6f15ee-a1fe-4807-8291-599e41409640),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9283f59d-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.264 2 DEBUG nova.objects.instance [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lazy-loading 'pci_devices' on Instance uuid 31cd065b-2fe3-418f-869b-a5ac7f4405f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.275 2 DEBUG nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] End _get_guest_xml xml=<domain type="kvm">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <uuid>31cd065b-2fe3-418f-869b-a5ac7f4405f8</uuid>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <name>instance-00000003</name>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <memory>131072</memory>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <vcpu>1</vcpu>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1391753478</nova:name>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <nova:creationTime>2025-10-07 20:10:13</nova:creationTime>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <nova:flavor name="m1.nano">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:        <nova:memory>128</nova:memory>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:        <nova:disk>1</nova:disk>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:        <nova:swap>0</nova:swap>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:        <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:        <nova:vcpus>1</nova:vcpus>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      </nova:flavor>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <nova:owner>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:        <nova:user uuid="db22b0e0f6594362af24484ba9b01936">tempest-TestNetworkAdvancedServerOps-585003851-project-member</nova:user>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:        <nova:project uuid="8a545a398e2e433bbe3f3dfa2ec4ebcb">tempest-TestNetworkAdvancedServerOps-585003851</nova:project>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      </nova:owner>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <nova:ports>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:        <nova:port uuid="9283f59d-4eb5-4e2a-876d-b078582f6dec">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      </nova:ports>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    </nova:instance>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <sysinfo type="smbios">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <entry name="manufacturer">RDO</entry>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <entry name="product">OpenStack Compute</entry>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <entry name="serial">31cd065b-2fe3-418f-869b-a5ac7f4405f8</entry>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <entry name="uuid">31cd065b-2fe3-418f-869b-a5ac7f4405f8</entry>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <entry name="family">Virtual Machine</entry>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <boot dev="hd"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <smbios mode="sysinfo"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <vmcoreinfo/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <clock offset="utc">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <timer name="pit" tickpolicy="delay"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <timer name="hpet" present="no"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <cpu mode="custom" match="exact">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <model>Nehalem</model>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <topology sockets="1" cores="1" threads="1"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <disk type="file" device="disk">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <target dev="vda" bus="virtio"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <disk type="file" device="cdrom">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <driver name="qemu" type="raw" cache="none"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.config"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <target dev="sda" bus="sata"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:6e:bc:c6"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <target dev="tap9283f59d-4e"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <serial type="pty">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <log file="/var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/console.log" append="off"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <input type="tablet" bus="usb"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <rng model="virtio">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <backend model="random">/dev/urandom</backend>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <controller type="usb" index="0"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    <memballoon model="virtio">
Oct  7 16:10:13 np0005474864 nova_compute[192593]:      <stats period="10"/>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:10:13 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:10:13 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:10:13 np0005474864 nova_compute[192593]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.276 2 DEBUG nova.compute.manager [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Preparing to wait for external event network-vif-plugged-9283f59d-4eb5-4e2a-876d-b078582f6dec prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.276 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.277 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.277 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.278 2 DEBUG nova.virt.libvirt.vif [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:10:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1391753478',display_name='tempest-TestNetworkAdvancedServerOps-server-1391753478',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1391753478',id=3,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHGp9K/m1XQlBJloQMlWOiYAkMHRg/+YyV7EIFeU64B1nJDtz1wGfsQsDxfqhOEvcl/IBS6gweH/4Fue49rFzrh66+jFDwTRyWcSgsUsGaMU3Uma/s2qqLF3+L5vxqg9xw==',key_name='tempest-TestNetworkAdvancedServerOps-1360006664',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8a545a398e2e433bbe3f3dfa2ec4ebcb',ramdisk_id='',reservation_id='r-6hld8jn4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-585003851',owner_user_name='tempest-TestNetworkAdvancedServerOps-585003851-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:10:05Z,user_data=None,user_id='db22b0e0f6594362af24484ba9b01936',uuid=31cd065b-2fe3-418f-869b-a5ac7f4405f8,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "address": "fa:16:3e:6e:bc:c6", "network": {"id": "3c6f15ee-a1fe-4807-8291-599e41409640", "bridge": "br-int", "label": "tempest-network-smoke--1071304693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9283f59d-4e", "ovs_interfaceid": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.278 2 DEBUG nova.network.os_vif_util [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converting VIF {"id": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "address": "fa:16:3e:6e:bc:c6", "network": {"id": "3c6f15ee-a1fe-4807-8291-599e41409640", "bridge": "br-int", "label": "tempest-network-smoke--1071304693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9283f59d-4e", "ovs_interfaceid": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.279 2 DEBUG nova.network.os_vif_util [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6e:bc:c6,bridge_name='br-int',has_traffic_filtering=True,id=9283f59d-4eb5-4e2a-876d-b078582f6dec,network=Network(3c6f15ee-a1fe-4807-8291-599e41409640),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9283f59d-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.279 2 DEBUG os_vif [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:bc:c6,bridge_name='br-int',has_traffic_filtering=True,id=9283f59d-4eb5-4e2a-876d-b078582f6dec,network=Network(3c6f15ee-a1fe-4807-8291-599e41409640),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9283f59d-4e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.281 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.281 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.305 2 DEBUG nova.compute.manager [req-a870d264-8100-45c0-bdc0-ce2585b75701 req-cc78c02a-4ed1-4c8e-af6f-c92a2a90f822 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Received event network-changed-6ac9cc3d-5039-41fa-a966-ec61d9e9c38b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.306 2 DEBUG nova.compute.manager [req-a870d264-8100-45c0-bdc0-ce2585b75701 req-cc78c02a-4ed1-4c8e-af6f-c92a2a90f822 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Refreshing instance network info cache due to event network-changed-6ac9cc3d-5039-41fa-a966-ec61d9e9c38b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.306 2 DEBUG oslo_concurrency.lockutils [req-a870d264-8100-45c0-bdc0-ce2585b75701 req-cc78c02a-4ed1-4c8e-af6f-c92a2a90f822 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.306 2 DEBUG oslo_concurrency.lockutils [req-a870d264-8100-45c0-bdc0-ce2585b75701 req-cc78c02a-4ed1-4c8e-af6f-c92a2a90f822 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.307 2 DEBUG nova.network.neutron [req-a870d264-8100-45c0-bdc0-ce2585b75701 req-cc78c02a-4ed1-4c8e-af6f-c92a2a90f822 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Refreshing network info cache for port 6ac9cc3d-5039-41fa-a966-ec61d9e9c38b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.920 2 INFO oslo.privsep.daemon [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.793 92 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.800 92 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.803 92 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.804 92 INFO oslo.privsep.daemon [-] privsep daemon running as pid 92#033[00m
Oct  7 16:10:13 np0005474864 nova_compute[192593]: 2025-10-07 20:10:13.927 2 WARNING oslo_privsep.priv_context [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] privsep daemon already running#033[00m
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.264 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6ac9cc3d-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.265 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6ac9cc3d-50, col_values=(('external_ids', {'iface-id': '6ac9cc3d-5039-41fa-a966-ec61d9e9c38b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5d:0b:e9', 'vm-uuid': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:10:14 np0005474864 NetworkManager[51631]: <info>  [1759867814.2707] manager: (tap6ac9cc3d-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/23)
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.280 2 INFO os_vif [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5d:0b:e9,bridge_name='br-int',has_traffic_filtering=True,id=6ac9cc3d-5039-41fa-a966-ec61d9e9c38b,network=Network(a9053617-1148-4139-a949-8321e760481f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ac9cc3d-50')#033[00m
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.282 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.283 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9283f59d-4e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.284 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9283f59d-4e, col_values=(('external_ids', {'iface-id': '9283f59d-4eb5-4e2a-876d-b078582f6dec', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6e:bc:c6', 'vm-uuid': '31cd065b-2fe3-418f-869b-a5ac7f4405f8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:10:14 np0005474864 NetworkManager[51631]: <info>  [1759867814.2875] manager: (tap9283f59d-4e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/24)
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.299 2 INFO os_vif [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6e:bc:c6,bridge_name='br-int',has_traffic_filtering=True,id=9283f59d-4eb5-4e2a-876d-b078582f6dec,network=Network(3c6f15ee-a1fe-4807-8291-599e41409640),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9283f59d-4e')#033[00m
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.350 2 DEBUG nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.351 2 DEBUG nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.351 2 DEBUG nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] No VIF found with MAC fa:16:3e:5d:0b:e9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.352 2 INFO nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Using config drive#033[00m
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.380 2 DEBUG nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.381 2 DEBUG nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.381 2 DEBUG nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] No VIF found with MAC fa:16:3e:6e:bc:c6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:10:14 np0005474864 nova_compute[192593]: 2025-10-07 20:10:14.382 2 INFO nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Using config drive#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.091 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.115 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.115 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.116 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.116 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.303 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.389 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.390 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.415 2 INFO nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Creating config drive at /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.config#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.420 2 DEBUG oslo_concurrency.processutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1mjfd0vq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.446 2 INFO nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Creating config drive at /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.config#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.450 2 DEBUG oslo_concurrency.processutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_9w6g1is execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.479 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.481 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Periodic task is updating the host stat, it is trying to get disk instance-00000002, but disk file was removed by concurrent operations such as resize.: FileNotFoundError: [Errno 2] No such file or directory: '/var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.config'#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.486 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.561 2 DEBUG oslo_concurrency.processutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1mjfd0vq" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.589 2 DEBUG oslo_concurrency.processutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_9w6g1is" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.600 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk --force-share --output=json" returned: 0 in 0.114s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.602 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.685 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:10:15 np0005474864 kernel: tun: Universal TUN/TAP device driver, 1.6
Oct  7 16:10:15 np0005474864 kernel: tap6ac9cc3d-50: entered promiscuous mode
Oct  7 16:10:15 np0005474864 kernel: tap9283f59d-4e: entered promiscuous mode
Oct  7 16:10:15 np0005474864 NetworkManager[51631]: <info>  [1759867815.7016] manager: (tap6ac9cc3d-50): new Tun device (/org/freedesktop/NetworkManager/Devices/25)
Oct  7 16:10:15 np0005474864 NetworkManager[51631]: <info>  [1759867815.7037] manager: (tap9283f59d-4e): new Tun device (/org/freedesktop/NetworkManager/Devices/26)
Oct  7 16:10:15 np0005474864 ovn_controller[94801]: 2025-10-07T20:10:15Z|00027|binding|INFO|Claiming lport 6ac9cc3d-5039-41fa-a966-ec61d9e9c38b for this chassis.
Oct  7 16:10:15 np0005474864 ovn_controller[94801]: 2025-10-07T20:10:15Z|00028|binding|INFO|6ac9cc3d-5039-41fa-a966-ec61d9e9c38b: Claiming fa:16:3e:5d:0b:e9 10.100.0.11
Oct  7 16:10:15 np0005474864 ovn_controller[94801]: 2025-10-07T20:10:15Z|00029|binding|INFO|Claiming lport 9283f59d-4eb5-4e2a-876d-b078582f6dec for this chassis.
Oct  7 16:10:15 np0005474864 ovn_controller[94801]: 2025-10-07T20:10:15Z|00030|binding|INFO|9283f59d-4eb5-4e2a-876d-b078582f6dec: Claiming fa:16:3e:6e:bc:c6 10.100.0.12
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:15 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:15.766 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:bc:c6 10.100.0.12'], port_security=['fa:16:3e:6e:bc:c6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c6f15ee-a1fe-4807-8291-599e41409640', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'c698aec5-5b02-413c-9427-198232253cc4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=63dd33cd-4919-4f9e-b01e-a4cd30047532, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=9283f59d-4eb5-4e2a-876d-b078582f6dec) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:10:15 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:15.768 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:0b:e9 10.100.0.11'], port_security=['fa:16:3e:5d:0b:e9 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a9053617-1148-4139-a949-8321e760481f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '57491b24c6b2419c842483a87c8b4d42', 'neutron:revision_number': '2', 'neutron:security_group_ids': '827ff091-5676-4b1d-8d1c-d3af6f7c6fff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a890b16f-fa51-4e24-8f2d-bf0ff593911f, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=6ac9cc3d-5039-41fa-a966-ec61d9e9c38b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:10:15 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:15.769 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 9283f59d-4eb5-4e2a-876d-b078582f6dec in datapath 3c6f15ee-a1fe-4807-8291-599e41409640 bound to our chassis#033[00m
Oct  7 16:10:15 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:15.772 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3c6f15ee-a1fe-4807-8291-599e41409640#033[00m
Oct  7 16:10:15 np0005474864 systemd-udevd[220408]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:10:15 np0005474864 systemd-udevd[220406]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:10:15 np0005474864 NetworkManager[51631]: <info>  [1759867815.8109] device (tap9283f59d-4e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:10:15 np0005474864 NetworkManager[51631]: <info>  [1759867815.8122] device (tap9283f59d-4e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:10:15 np0005474864 NetworkManager[51631]: <info>  [1759867815.8165] device (tap6ac9cc3d-50): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:10:15 np0005474864 NetworkManager[51631]: <info>  [1759867815.8175] device (tap6ac9cc3d-50): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:10:15 np0005474864 systemd-machined[152586]: New machine qemu-1-instance-00000003.
Oct  7 16:10:15 np0005474864 systemd[1]: Started Virtual Machine qemu-1-instance-00000003.
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:15 np0005474864 systemd-machined[152586]: New machine qemu-2-instance-00000002.
Oct  7 16:10:15 np0005474864 ovn_controller[94801]: 2025-10-07T20:10:15Z|00031|binding|INFO|Setting lport 6ac9cc3d-5039-41fa-a966-ec61d9e9c38b ovn-installed in OVS
Oct  7 16:10:15 np0005474864 ovn_controller[94801]: 2025-10-07T20:10:15Z|00032|binding|INFO|Setting lport 6ac9cc3d-5039-41fa-a966-ec61d9e9c38b up in Southbound
Oct  7 16:10:15 np0005474864 ovn_controller[94801]: 2025-10-07T20:10:15Z|00033|binding|INFO|Setting lport 9283f59d-4eb5-4e2a-876d-b078582f6dec ovn-installed in OVS
Oct  7 16:10:15 np0005474864 ovn_controller[94801]: 2025-10-07T20:10:15Z|00034|binding|INFO|Setting lport 9283f59d-4eb5-4e2a-876d-b078582f6dec up in Southbound
Oct  7 16:10:15 np0005474864 nova_compute[192593]: 2025-10-07 20:10:15.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:15 np0005474864 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.000 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.002 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5863MB free_disk=73.46758270263672GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.002 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.002 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.076 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Instance 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.077 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Instance 31cd065b-2fe3-418f-869b-a5ac7f4405f8 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.077 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.077 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.181 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Updating inventory in ProviderTree for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  7 16:10:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:16.181 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:16.182 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:16.182 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.220 2 ERROR nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [req-a7b6f101-fe68-464c-8dc3-e39047e768d5] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-a7b6f101-fe68-464c-8dc3-e39047e768d5"}]}#033[00m
Oct  7 16:10:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:16.248 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[89273874-b5f1-45cb-a05e-ff502c0a92da]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:16.250 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3c6f15ee-a1 in ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:10:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:16.252 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3c6f15ee-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:10:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:16.252 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d22650ac-9e55-48cf-8129-b44cc6f1060b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.252 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Refreshing inventories for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  7 16:10:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:16.253 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[5d67181f-da31-4b9b-9011-b58717f15914]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.273 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Updating ProviderTree inventory for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.274 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Updating inventory in ProviderTree for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  7 16:10:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:16.294 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[6ac4c8f5-b1e5-4a1a-a67a-ce898780764b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.295 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Refreshing aggregate associations for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.322 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Refreshing trait associations for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  7 16:10:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:16.328 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c0548038-3fbb-4438-92cc-bd3520039e31]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:16.330 103685 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmp1hkmg_i7/privsep.sock']#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.379 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Updating inventory in ProviderTree for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.449 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Updated inventory for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.449 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Updating resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.450 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Updating inventory in ProviderTree for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.482 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.482 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.480s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.716 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759867816.7160213, 31cd065b-2fe3-418f-869b-a5ac7f4405f8 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.717 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] VM Started (Lifecycle Event)#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.745 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.750 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759867816.7161636, 31cd065b-2fe3-418f-869b-a5ac7f4405f8 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.750 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.770 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.775 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:10:16 np0005474864 nova_compute[192593]: 2025-10-07 20:10:16.823 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:10:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:17.067 103685 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Oct  7 16:10:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:17.068 103685 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp1hkmg_i7/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Oct  7 16:10:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:16.942 220454 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Oct  7 16:10:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:16.950 220454 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Oct  7 16:10:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:16.953 220454 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Oct  7 16:10:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:16.954 220454 INFO oslo.privsep.daemon [-] privsep daemon running as pid 220454#033[00m
Oct  7 16:10:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:17.072 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[7cc9b2e2-76f4-453a-8251-378d08068db3]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:17 np0005474864 nova_compute[192593]: 2025-10-07 20:10:17.131 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759867817.1312997, 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:10:17 np0005474864 nova_compute[192593]: 2025-10-07 20:10:17.132 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] VM Started (Lifecycle Event)#033[00m
Oct  7 16:10:17 np0005474864 nova_compute[192593]: 2025-10-07 20:10:17.170 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:10:17 np0005474864 nova_compute[192593]: 2025-10-07 20:10:17.175 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759867817.1323574, 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:10:17 np0005474864 nova_compute[192593]: 2025-10-07 20:10:17.175 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:10:17 np0005474864 nova_compute[192593]: 2025-10-07 20:10:17.214 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:10:17 np0005474864 nova_compute[192593]: 2025-10-07 20:10:17.219 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:10:17 np0005474864 nova_compute[192593]: 2025-10-07 20:10:17.244 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:10:17 np0005474864 nova_compute[192593]: 2025-10-07 20:10:17.343 2 DEBUG nova.network.neutron [req-5d0e19b7-fd87-4056-90ed-20a4772e9a59 req-0fe49385-61e8-4a91-8da3-e4cda845ac80 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Updated VIF entry in instance network info cache for port 9283f59d-4eb5-4e2a-876d-b078582f6dec. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:10:17 np0005474864 nova_compute[192593]: 2025-10-07 20:10:17.344 2 DEBUG nova.network.neutron [req-5d0e19b7-fd87-4056-90ed-20a4772e9a59 req-0fe49385-61e8-4a91-8da3-e4cda845ac80 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Updating instance_info_cache with network_info: [{"id": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "address": "fa:16:3e:6e:bc:c6", "network": {"id": "3c6f15ee-a1fe-4807-8291-599e41409640", "bridge": "br-int", "label": "tempest-network-smoke--1071304693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9283f59d-4e", "ovs_interfaceid": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:10:17 np0005474864 nova_compute[192593]: 2025-10-07 20:10:17.366 2 DEBUG oslo_concurrency.lockutils [req-5d0e19b7-fd87-4056-90ed-20a4772e9a59 req-0fe49385-61e8-4a91-8da3-e4cda845ac80 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-31cd065b-2fe3-418f-869b-a5ac7f4405f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:10:17 np0005474864 nova_compute[192593]: 2025-10-07 20:10:17.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:17.552 220454 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:17.552 220454 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:17.552 220454 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:18 np0005474864 nova_compute[192593]: 2025-10-07 20:10:18.058 2 DEBUG nova.network.neutron [req-a870d264-8100-45c0-bdc0-ce2585b75701 req-cc78c02a-4ed1-4c8e-af6f-c92a2a90f822 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Updated VIF entry in instance network info cache for port 6ac9cc3d-5039-41fa-a966-ec61d9e9c38b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:10:18 np0005474864 nova_compute[192593]: 2025-10-07 20:10:18.058 2 DEBUG nova.network.neutron [req-a870d264-8100-45c0-bdc0-ce2585b75701 req-cc78c02a-4ed1-4c8e-af6f-c92a2a90f822 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Updating instance_info_cache with network_info: [{"id": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "address": "fa:16:3e:5d:0b:e9", "network": {"id": "a9053617-1148-4139-a949-8321e760481f", "bridge": "br-int", "label": "tempest-network-smoke--1784539601", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac9cc3d-50", "ovs_interfaceid": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:10:18 np0005474864 nova_compute[192593]: 2025-10-07 20:10:18.081 2 DEBUG oslo_concurrency.lockutils [req-a870d264-8100-45c0-bdc0-ce2585b75701 req-cc78c02a-4ed1-4c8e-af6f-c92a2a90f822 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:18.084 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[987b1e74-9805-4dd4-b604-50b4cdc40ee4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:18.093 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[8c96fdec-5f24-4606-9a6d-9776895ad84c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:18 np0005474864 NetworkManager[51631]: <info>  [1759867818.0952] manager: (tap3c6f15ee-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:18.122 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[a557627b-758e-455c-a8a2-70cf20b6e264]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:18.125 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[84948201-e94d-4a2c-b5e9-6db382e2ad00]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:18 np0005474864 NetworkManager[51631]: <info>  [1759867818.1440] device (tap3c6f15ee-a0): carrier: link connected
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:18.147 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[8019feb5-a801-4ef7-8c21-0986dc239b08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:18.169 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[cf52d013-4065-4b5c-a0b2-7f164fe7c5f3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c6f15ee-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:ce:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345403, 'reachable_time': 17691, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220476, 'error': None, 'target': 'ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:18.188 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[cf61a3f7-e8f2-4ea4-a967-4a86b76df688]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea0:cec1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 345403, 'tstamp': 345403}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220478, 'error': None, 'target': 'ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:18.209 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d08139d5-1ba5-47b3-af49-686b3ff94879]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3c6f15ee-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:ce:c1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 3, 'rx_bytes': 110, 'tx_bytes': 266, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 3, 'rx_bytes': 110, 'tx_bytes': 266, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 15], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345403, 'reachable_time': 17691, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 3, 'outoctets': 224, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 3, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 224, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 3, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220479, 'error': None, 'target': 'ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:18.250 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[3c84329a-88b5-45f1-bd10-c2487e5ed071]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:18.342 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[f3e96cef-628a-4ea5-ac77-64444ad750b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:18.344 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c6f15ee-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:18.345 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:18.346 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3c6f15ee-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:10:18 np0005474864 kernel: tap3c6f15ee-a0: entered promiscuous mode
Oct  7 16:10:18 np0005474864 NetworkManager[51631]: <info>  [1759867818.3872] manager: (tap3c6f15ee-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Oct  7 16:10:18 np0005474864 nova_compute[192593]: 2025-10-07 20:10:18.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:18 np0005474864 nova_compute[192593]: 2025-10-07 20:10:18.389 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:18.392 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3c6f15ee-a0, col_values=(('external_ids', {'iface-id': 'c5e7e333-20ed-4e0b-a782-2e5e61b11c88'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:10:18 np0005474864 nova_compute[192593]: 2025-10-07 20:10:18.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:18 np0005474864 ovn_controller[94801]: 2025-10-07T20:10:18Z|00035|binding|INFO|Releasing lport c5e7e333-20ed-4e0b-a782-2e5e61b11c88 from this chassis (sb_readonly=0)
Oct  7 16:10:18 np0005474864 nova_compute[192593]: 2025-10-07 20:10:18.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:18.419 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3c6f15ee-a1fe-4807-8291-599e41409640.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3c6f15ee-a1fe-4807-8291-599e41409640.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:18.420 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[309b74df-33ed-4e64-894c-289c250fd690]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:18.422 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-3c6f15ee-a1fe-4807-8291-599e41409640
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/3c6f15ee-a1fe-4807-8291-599e41409640.pid.haproxy
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID 3c6f15ee-a1fe-4807-8291-599e41409640
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:10:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:18.424 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640', 'env', 'PROCESS_TAG=haproxy-3c6f15ee-a1fe-4807-8291-599e41409640', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3c6f15ee-a1fe-4807-8291-599e41409640.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:10:18 np0005474864 nova_compute[192593]: 2025-10-07 20:10:18.478 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:10:18 np0005474864 nova_compute[192593]: 2025-10-07 20:10:18.479 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:10:18 np0005474864 nova_compute[192593]: 2025-10-07 20:10:18.480 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:10:18 np0005474864 podman[220512]: 2025-10-07 20:10:18.850046868 +0000 UTC m=+0.056619187 container create c41131ea94a970e003535bc3b843d3b7bb0691733ff41bd4e9d4ba7c84e575e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Oct  7 16:10:18 np0005474864 systemd[1]: Started libpod-conmon-c41131ea94a970e003535bc3b843d3b7bb0691733ff41bd4e9d4ba7c84e575e5.scope.
Oct  7 16:10:18 np0005474864 podman[220512]: 2025-10-07 20:10:18.815732102 +0000 UTC m=+0.022304381 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:10:18 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:10:18 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/09d66dc9225f4ae99f35730a2ed566c60cf3527cb5f5f9599832757c233eb4a7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:10:18 np0005474864 podman[220512]: 2025-10-07 20:10:18.966371049 +0000 UTC m=+0.172943348 container init c41131ea94a970e003535bc3b843d3b7bb0691733ff41bd4e9d4ba7c84e575e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true)
Oct  7 16:10:18 np0005474864 podman[220512]: 2025-10-07 20:10:18.971267489 +0000 UTC m=+0.177839768 container start c41131ea94a970e003535bc3b843d3b7bb0691733ff41bd4e9d4ba7c84e575e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  7 16:10:18 np0005474864 neutron-haproxy-ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640[220527]: [NOTICE]   (220531) : New worker (220533) forked
Oct  7 16:10:18 np0005474864 neutron-haproxy-ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640[220527]: [NOTICE]   (220531) : Loading success.
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.027 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 6ac9cc3d-5039-41fa-a966-ec61d9e9c38b in datapath a9053617-1148-4139-a949-8321e760481f unbound from our chassis#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.031 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a9053617-1148-4139-a949-8321e760481f#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.043 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[e72ada1f-0383-4609-921f-8d9f6c909c57]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.044 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa9053617-11 in ovnmeta-a9053617-1148-4139-a949-8321e760481f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.046 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa9053617-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.046 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[5cf274b5-cfa6-4ac5-a346-fb9387b80e22]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.047 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[238c7108-e98b-4c19-91e9-9ed2069a0f0e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.075 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[72375a32-08a3-4637-b5bb-0de5623c328f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.104 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ed4c5d6c-8d00-4075-81f4-75b6736b58e4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.143 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[43d304f1-5e51-4f72-8278-9b2ca169eb43]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:19 np0005474864 NetworkManager[51631]: <info>  [1759867819.1524] manager: (tapa9053617-10): new Veth device (/org/freedesktop/NetworkManager/Devices/29)
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.155 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[7dfaa938-ff89-48d9-9733-29b6b9bcfacf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:19 np0005474864 systemd-udevd[220550]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.204 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[21ce22f8-9d68-4b16-87bd-136c787f499b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.209 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[47980859-ac35-4908-b0e4-dd15810e4736]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:19 np0005474864 NetworkManager[51631]: <info>  [1759867819.2461] device (tapa9053617-10): carrier: link connected
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.254 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[15ab6cc9-df0a-492a-bb05-21b8d549479e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.281 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[a80cf19f-c0b5-4842-8995-5a845c4a0c27]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa9053617-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f8:f3:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345513, 'reachable_time': 42590, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220569, 'error': None, 'target': 'ovnmeta-a9053617-1148-4139-a949-8321e760481f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:19 np0005474864 nova_compute[192593]: 2025-10-07 20:10:19.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.303 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[02dbb28f-fa1a-4597-be1d-22355a7894fb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fef8:f37e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 345513, 'tstamp': 345513}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220570, 'error': None, 'target': 'ovnmeta-a9053617-1148-4139-a949-8321e760481f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.327 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[3b823caf-bf99-4090-b4ab-68d7175ad9bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa9053617-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:f8:f3:7e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 16], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345513, 'reachable_time': 42590, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220571, 'error': None, 'target': 'ovnmeta-a9053617-1148-4139-a949-8321e760481f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.373 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d3c92ae2-69da-458e-98c0-674bae5616c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.459 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[16e260bb-1a7d-44b9-b7d3-30ed3f4d59b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.461 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa9053617-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.462 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.463 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa9053617-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:10:19 np0005474864 nova_compute[192593]: 2025-10-07 20:10:19.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:19 np0005474864 kernel: tapa9053617-10: entered promiscuous mode
Oct  7 16:10:19 np0005474864 NetworkManager[51631]: <info>  [1759867819.5140] manager: (tapa9053617-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/30)
Oct  7 16:10:19 np0005474864 nova_compute[192593]: 2025-10-07 20:10:19.515 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.517 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa9053617-10, col_values=(('external_ids', {'iface-id': '7ace7da2-42dc-433a-8b4d-8286301cfa0e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:10:19 np0005474864 nova_compute[192593]: 2025-10-07 20:10:19.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:19 np0005474864 ovn_controller[94801]: 2025-10-07T20:10:19Z|00036|binding|INFO|Releasing lport 7ace7da2-42dc-433a-8b4d-8286301cfa0e from this chassis (sb_readonly=0)
Oct  7 16:10:19 np0005474864 nova_compute[192593]: 2025-10-07 20:10:19.546 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.547 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a9053617-1148-4139-a949-8321e760481f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a9053617-1148-4139-a949-8321e760481f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.548 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[912f31b4-f4d9-4987-8c4f-0ceefb85e363]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.549 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-a9053617-1148-4139-a949-8321e760481f
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/a9053617-1148-4139-a949-8321e760481f.pid.haproxy
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID a9053617-1148-4139-a949-8321e760481f
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:10:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:19.551 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a9053617-1148-4139-a949-8321e760481f', 'env', 'PROCESS_TAG=haproxy-a9053617-1148-4139-a949-8321e760481f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a9053617-1148-4139-a949-8321e760481f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:10:20 np0005474864 podman[220604]: 2025-10-07 20:10:20.013392709 +0000 UTC m=+0.077731294 container create 7c81144c8b8c0111e73e7834cb0feef96b4106ce0381f967bf0cb1518e112efa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a9053617-1148-4139-a949-8321e760481f, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  7 16:10:20 np0005474864 systemd[1]: Started libpod-conmon-7c81144c8b8c0111e73e7834cb0feef96b4106ce0381f967bf0cb1518e112efa.scope.
Oct  7 16:10:20 np0005474864 podman[220604]: 2025-10-07 20:10:19.978749894 +0000 UTC m=+0.043088539 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:10:20 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:10:20 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0915cbe3cc260c42bf96e2db68778e8e1d546d3b4e68292d6f4238a617767750/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:10:20 np0005474864 podman[220604]: 2025-10-07 20:10:20.113343489 +0000 UTC m=+0.177682134 container init 7c81144c8b8c0111e73e7834cb0feef96b4106ce0381f967bf0cb1518e112efa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a9053617-1148-4139-a949-8321e760481f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:10:20 np0005474864 podman[220604]: 2025-10-07 20:10:20.122747609 +0000 UTC m=+0.187086204 container start 7c81144c8b8c0111e73e7834cb0feef96b4106ce0381f967bf0cb1518e112efa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a9053617-1148-4139-a949-8321e760481f, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:10:20 np0005474864 neutron-haproxy-ovnmeta-a9053617-1148-4139-a949-8321e760481f[220620]: [NOTICE]   (220624) : New worker (220626) forked
Oct  7 16:10:20 np0005474864 neutron-haproxy-ovnmeta-a9053617-1148-4139-a949-8321e760481f[220620]: [NOTICE]   (220624) : Loading success.
Oct  7 16:10:21 np0005474864 nova_compute[192593]: 2025-10-07 20:10:21.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:10:21 np0005474864 nova_compute[192593]: 2025-10-07 20:10:21.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:10:21 np0005474864 nova_compute[192593]: 2025-10-07 20:10:21.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:10:21 np0005474864 nova_compute[192593]: 2025-10-07 20:10:21.120 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  7 16:10:21 np0005474864 nova_compute[192593]: 2025-10-07 20:10:21.120 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  7 16:10:21 np0005474864 nova_compute[192593]: 2025-10-07 20:10:21.121 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:10:21 np0005474864 nova_compute[192593]: 2025-10-07 20:10:21.121 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:10:21 np0005474864 nova_compute[192593]: 2025-10-07 20:10:21.122 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:10:21 np0005474864 nova_compute[192593]: 2025-10-07 20:10:21.122 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:10:22 np0005474864 nova_compute[192593]: 2025-10-07 20:10:22.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:10:22 np0005474864 podman[220635]: 2025-10-07 20:10:22.374002947 +0000 UTC m=+0.058586895 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  7 16:10:22 np0005474864 podman[220636]: 2025-10-07 20:10:22.393512037 +0000 UTC m=+0.071489075 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, architecture=x86_64, container_name=openstack_network_exporter, distribution-scope=public, io.openshift.expose-services=, maintainer=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat 
Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, vcs-type=git, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6)
Oct  7 16:10:22 np0005474864 nova_compute[192593]: 2025-10-07 20:10:22.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:24 np0005474864 nova_compute[192593]: 2025-10-07 20:10:24.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:27 np0005474864 nova_compute[192593]: 2025-10-07 20:10:27.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:28 np0005474864 podman[220682]: 2025-10-07 20:10:28.377792278 +0000 UTC m=+0.078082344 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  7 16:10:28 np0005474864 podman[220684]: 2025-10-07 20:10:28.402534978 +0000 UTC m=+0.081616125 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:10:28 np0005474864 podman[220683]: 2025-10-07 20:10:28.430684707 +0000 UTC m=+0.115814108 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS)
Oct  7 16:10:29 np0005474864 nova_compute[192593]: 2025-10-07 20:10:29.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:31.800 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}ba12968436a56a40594bfedc977eac8eb05337ae1a36d2f0e9e3ab753414662a" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Oct  7 16:10:32 np0005474864 podman[220746]: 2025-10-07 20:10:32.382607617 +0000 UTC m=+0.072862894 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  7 16:10:32 np0005474864 nova_compute[192593]: 2025-10-07 20:10:32.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:34 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:34.296 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Tue, 07 Oct 2025 20:10:31 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-34e515c6-282d-4240-b3f2-5d389074e0df x-openstack-request-id: req-34e515c6-282d-4240-b3f2-5d389074e0df _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Oct  7 16:10:34 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:34.296 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "3fec056a-1226-48ad-a02c-e4fe097a9363", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/3fec056a-1226-48ad-a02c-e4fe097a9363"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/3fec056a-1226-48ad-a02c-e4fe097a9363"}]}, {"id": "6f818030-fd65-46a5-919a-d1c6799f41ca", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/6f818030-fd65-46a5-919a-d1c6799f41ca"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/6f818030-fd65-46a5-919a-d1c6799f41ca"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Oct  7 16:10:34 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:34.296 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-34e515c6-282d-4240-b3f2-5d389074e0df request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Oct  7 16:10:34 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:34.300 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/3fec056a-1226-48ad-a02c-e4fe097a9363 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}ba12968436a56a40594bfedc977eac8eb05337ae1a36d2f0e9e3ab753414662a" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Oct  7 16:10:34 np0005474864 nova_compute[192593]: 2025-10-07 20:10:34.396 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.705 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Tue, 07 Oct 2025 20:10:34 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-9083ae09-b238-4674-a578-553ee018656e x-openstack-request-id: req-9083ae09-b238-4674-a578-553ee018656e _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.706 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "3fec056a-1226-48ad-a02c-e4fe097a9363", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/3fec056a-1226-48ad-a02c-e4fe097a9363"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/3fec056a-1226-48ad-a02c-e4fe097a9363"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.706 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/3fec056a-1226-48ad-a02c-e4fe097a9363 used request id req-9083ae09-b238-4674-a578-553ee018656e request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.707 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'name': 'tempest-TestNetworkBasicOps-server-1247597446', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '57491b24c6b2419c842483a87c8b4d42', 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'hostId': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.711 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000003', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'hostId': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.711 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.739 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.740 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.770 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.772 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c6720883-beb1-440e-99e0-ea8b19059d9f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-vda', 'timestamp': '2025-10-07T20:10:35.711883', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'instance-00000002', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af83a096-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.661886305, 'message_signature': '9565a8fd1db3f50814622e409c75408c0ea6c370f4f38734b1ecf7227c7c124d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': 
None, 'resource_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-sda', 'timestamp': '2025-10-07T20:10:35.711883', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'instance-00000002', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af83b748-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.661886305, 'message_signature': 'ec1f87dc8228c3af6d81912ab4ed65ecb6c5264b7b1498395d9251a2043d6650'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8-vda', 'timestamp': '2025-10-07T20:10:35.711883', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'instance-00000003', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': 
{'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af886c16-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.690702553, 'message_signature': '83ee76f88b2660ffe00ccf7871dbd4d0922847ebb40a346aee971726d060eed4'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8-sda', 'timestamp': '2025-10-07T20:10:35.711883', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'instance-00000003', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af888e9e-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.690702553, 'message_signature': '4e8f1510c8dc3aa3ef9f8af6b706f92d513e868f5b57481f494d8cead3ee68cc'}]}, 'timestamp': '2025-10-07 20:10:35.772837', '_unique_id': 'b9546130921b44d2980959ff9c663115'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.785 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.791 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.797 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679 / tap6ac9cc3d-50 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.798 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.802 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 31cd065b-2fe3-418f-869b-a5ac7f4405f8 / tap9283f59d-4e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.802 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7cb20ee6-04b5-49ff-b78c-fec4aa6e256e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-00000002-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-tap6ac9cc3d-50', 'timestamp': '2025-10-07T20:10:35.791778', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'tap6ac9cc3d-50', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:0b:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ac9cc3d-50'}, 'message_id': 'af8c8e72-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.741872392, 'message_signature': '1bffe97dc1e792df82fb0497cb37bfb15f3740aaf907dcc4b82c7d4573e56184'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': 
'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': 'instance-00000003-31cd065b-2fe3-418f-869b-a5ac7f4405f8-tap9283f59d-4e', 'timestamp': '2025-10-07T20:10:35.791778', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'tap9283f59d-4e', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:bc:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9283f59d-4e'}, 'message_id': 'af8d3eee-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.748853163, 'message_signature': 'dcf73432e9fcdc02b9c3159b0757680df8681e4cfa5df4cde69a9ecf14c2c367'}]}, 'timestamp': '2025-10-07 20:10:35.803572', '_unique_id': '389b49e609d04678a8824edb40e840e7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.805 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.806 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.807 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.807 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac8c365d-7969-4619-856a-9e4236076362', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-00000002-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-tap6ac9cc3d-50', 'timestamp': '2025-10-07T20:10:35.807155', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'tap6ac9cc3d-50', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:0b:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ac9cc3d-50'}, 'message_id': 'af8de556-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.741872392, 'message_signature': 'c70ae7692af16ebeec5e5b7240d5284d75c5dbe92886aa96c4b4b224f4456ac7'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': 'instance-00000003-31cd065b-2fe3-418f-869b-a5ac7f4405f8-tap9283f59d-4e', 'timestamp': '2025-10-07T20:10:35.807155', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'tap9283f59d-4e', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:bc:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9283f59d-4e'}, 'message_id': 'af8df9c4-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.748853163, 'message_signature': 'd95c494c603c1892611213846a7046c0541a24effd2665bfbb95dddc2035b33d'}]}, 'timestamp': '2025-10-07 20:10:35.808308', '_unique_id': 'f46e95b512f04bc288cab06e7e1fab5c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.809 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.810 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.830 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.831 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.848 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.849 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82e65bf2-8799-4c4f-be30-b77b165527cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-vda', 'timestamp': '2025-10-07T20:10:35.810974', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'instance-00000002', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af91894a-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.760992842, 'message_signature': '2de00f16a5fa1898a6b7759ad4969c0fa12a13e0fdd5bf4ce8f533d18a8d1da9'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 
'3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-sda', 'timestamp': '2025-10-07T20:10:35.810974', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'instance-00000002', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af919fe8-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.760992842, 'message_signature': '44b49f19bb3aa0e210d065ba61d388cf58230f204a6e3ccc9c410b1f50ae683d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8-vda', 'timestamp': '2025-10-07T20:10:35.810974', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'instance-00000003', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af944950-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.781974134, 'message_signature': 'a5cc0016ee5f147328b2727211efd6473d7cdbd637ea68e804536d1cf842d20e'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8-sda', 'timestamp': '2025-10-07T20:10:35.810974', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'instance-00000003', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af946084-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.781974134, 'message_signature': '3bc1e09819b63e72f3da99d6ae5b41d7d01a4109885088323bc021a1576ac212'}]}, 'timestamp': '2025-10-07 20:10:35.850204', '_unique_id': '2205c33ab7f94ef4a1fc7009314aa243'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.852 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.853 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.882 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.882 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679: ceilometer.compute.pollsters.NoVolumeException
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.911 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.911 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 31cd065b-2fe3-418f-869b-a5ac7f4405f8: ceilometer.compute.pollsters.NoVolumeException
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.911 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.911 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.911 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.912 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.912 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5ee0477-53d8-4b76-a93d-f2e32739d86b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-vda', 'timestamp': '2025-10-07T20:10:35.911536', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'instance-00000002', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af9dcf52-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.661886305, 'message_signature': 'a3324d1b8d4518abebf506b5cda5aceaaddfffb191ca3694cad9e0499518c8a5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 
'resource_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-sda', 'timestamp': '2025-10-07T20:10:35.911536', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'instance-00000002', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af9ddd4e-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.661886305, 'message_signature': '73f454b54aea60ed7936c4b80c082cc4008ac082d6957d544692ba18a7d1b072'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8-vda', 'timestamp': '2025-10-07T20:10:35.911536', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'instance-00000003', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af9deaa0-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.690702553, 'message_signature': '1c24961aff3bd62e2cecdde471db06103954380a94e13b04a846120f9506450e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8-sda', 'timestamp': '2025-10-07T20:10:35.911536', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'instance-00000003', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af9df540-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.690702553, 'message_signature': '65c521d1687cf1cca4bbb5fc38f06500ab0c242c9a1113a0898c1037336d6bdd'}]}, 'timestamp': '2025-10-07 20:10:35.912882', '_unique_id': 'eb8260cec47f4a67aa7a7ba154073ea6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.914 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.915 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.915 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ffed8083-87f3-43a3-a9a2-28b2f32c3152', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-00000002-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-tap6ac9cc3d-50', 'timestamp': '2025-10-07T20:10:35.915056', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'tap6ac9cc3d-50', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:0b:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ac9cc3d-50'}, 'message_id': 'af9e590e-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.741872392, 'message_signature': '6d8b5ab5ccdb3cd2f8fd715632230db219f8c3202eefbe91cc2d84e2e8befdff'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': 'instance-00000003-31cd065b-2fe3-418f-869b-a5ac7f4405f8-tap9283f59d-4e', 'timestamp': '2025-10-07T20:10:35.915056', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'tap9283f59d-4e', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:bc:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9283f59d-4e'}, 'message_id': 'af9e64c6-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.748853163, 'message_signature': '8004dee64d2eb31783ca2b1c2e02cbc18796af70c875be011558432f6f03048b'}]}, 'timestamp': '2025-10-07 20:10:35.915738', '_unique_id': 'be9ed0f8744d46329c975464c4d3e6eb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.916 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.917 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.917 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.917 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '32c45e99-493f-4d65-96b6-ef0288e8315a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-00000002-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-tap6ac9cc3d-50', 'timestamp': '2025-10-07T20:10:35.917277', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'tap6ac9cc3d-50', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:0b:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ac9cc3d-50'}, 'message_id': 'af9ead14-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.741872392, 'message_signature': '99881e4704cd8021bca1bdcae83d59418f75088cf207fda1508ebf48b7de9872'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': 'instance-00000003-31cd065b-2fe3-418f-869b-a5ac7f4405f8-tap9283f59d-4e', 'timestamp': '2025-10-07T20:10:35.917277', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'tap9283f59d-4e', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:bc:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9283f59d-4e'}, 'message_id': 'af9eb868-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.748853163, 'message_signature': '5ba0aa4cbcabc06d2fa19896211923cc8ddf483564ef794feb5bba27dbfd079e'}]}, 'timestamp': '2025-10-07 20:10:35.917872', '_unique_id': '2297e5e5291747249d84ab95a003b250'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.918 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.919 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.919 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.919 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0579ce90-5826-4e66-ae73-9c2ec87d4d42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-00000002-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-tap6ac9cc3d-50', 'timestamp': '2025-10-07T20:10:35.919395', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'tap6ac9cc3d-50', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:0b:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ac9cc3d-50'}, 'message_id': 'af9eff80-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.741872392, 'message_signature': '49bfdd0bbdff674b4dd14c82615670cc5f3293c299aaf5e09f1b03d4bde2f9c0'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': 'instance-00000003-31cd065b-2fe3-418f-869b-a5ac7f4405f8-tap9283f59d-4e', 'timestamp': '2025-10-07T20:10:35.919395', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'tap9283f59d-4e', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:bc:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9283f59d-4e'}, 'message_id': 'af9f0aac-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.748853163, 'message_signature': 'c605741ccd864d145e9578993ad5d503ee43ee8807e39ffd3cdcada7d366ff99'}]}, 'timestamp': '2025-10-07 20:10:35.919977', '_unique_id': 'dcb8e609725344448078a83d14a7103e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.920 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.921 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.921 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.921 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '559b688d-01e9-4452-a97c-67785d398014', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-00000002-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-tap6ac9cc3d-50', 'timestamp': '2025-10-07T20:10:35.921502', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'tap6ac9cc3d-50', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:0b:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ac9cc3d-50'}, 'message_id': 'af9f51d8-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.741872392, 'message_signature': 'ea69fbe3a5c3552cd45028c47e9addfec9284e09d6731bba510b2aab5b92385a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': 'instance-00000003-31cd065b-2fe3-418f-869b-a5ac7f4405f8-tap9283f59d-4e', 'timestamp': '2025-10-07T20:10:35.921502', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'tap9283f59d-4e', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:bc:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9283f59d-4e'}, 'message_id': 'af9f5cf0-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.748853163, 'message_signature': '100c9054cba457d25f822147f3d2ac9a74aca6ea71420c12740b194b57d411fb'}]}, 'timestamp': '2025-10-07 20:10:35.922081', '_unique_id': '7e5e7c3de5224a159441735427b3a326'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.922 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.923 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.923 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.923 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1247597446>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1391753478>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1247597446>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1391753478>]
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.924 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.924 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.924 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.924 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.925 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59c3414a-46fb-46f2-afa3-4e560af785b1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-vda', 'timestamp': '2025-10-07T20:10:35.924279', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'instance-00000002', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af9fc00a-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.661886305, 'message_signature': 'a5b2df7b12b03cb7fb8177f89eb006d99eba55be8d45e7e8e6c0aa568757e01a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 
'3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-sda', 'timestamp': '2025-10-07T20:10:35.924279', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'instance-00000002', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af9fcad2-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.661886305, 'message_signature': '544f2210a061e08e25b08ddb776542da45a8d7b70369c3402e6daf51f98e1c2e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8-vda', 'timestamp': '2025-10-07T20:10:35.924279', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'instance-00000003', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'af9fd50e-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.690702553, 'message_signature': '009fcac3459a7256c97ac485cabdb5db1e9f02ad2564a082e85063371341da94'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8-sda', 'timestamp': '2025-10-07T20:10:35.924279', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'instance-00000003', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'af9fe044-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.690702553, 'message_signature': 'affc59bfbc334cedb2fd2b0da83f0a12313230d8fc1e5f01388bb671f1a51ee0'}]}, 'timestamp': '2025-10-07 20:10:35.925460', '_unique_id': '4bc3d6c00e97491c93cf2736e0b3ff13'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.926 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.927 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.927 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '17c7c4a2-9f5b-423f-9fad-bc3ad162c5d0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-00000002-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-tap6ac9cc3d-50', 'timestamp': '2025-10-07T20:10:35.927008', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'tap6ac9cc3d-50', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:0b:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ac9cc3d-50'}, 'message_id': 'afa028d8-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.741872392, 'message_signature': '3bea3a34ec0719cb809f33ee3c307c300414ebaf90f8cddd611b5ccb04b97d05'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': 'instance-00000003-31cd065b-2fe3-418f-869b-a5ac7f4405f8-tap9283f59d-4e', 'timestamp': '2025-10-07T20:10:35.927008', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'tap9283f59d-4e', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:bc:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9283f59d-4e'}, 'message_id': 'afa03756-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.748853163, 'message_signature': '18417ebd0925aa8517852f83bdd2143850f87b1acb14800d518fdab656744130'}]}, 'timestamp': '2025-10-07 20:10:35.927689', '_unique_id': '4c091d6c21c64c04be65db6d5b72d01d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.928 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.929 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.930 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.930 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.931 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.931 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '919e8036-d4e1-4ac8-9985-75af14afd4f2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-vda', 'timestamp': '2025-10-07T20:10:35.930105', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'instance-00000002', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'afa0a696-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.661886305, 'message_signature': 'b2ff3dfdea7b49a4180889fee317bb5ba6290100f0f29ac1442293648237821d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 
'resource_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-sda', 'timestamp': '2025-10-07T20:10:35.930105', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'instance-00000002', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'afa0b74e-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.661886305, 'message_signature': '47d01371f2c9592c5683f8541d84b6f68b7a27dc21792671f0b2f4eedf47a3bf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8-vda', 'timestamp': '2025-10-07T20:10:35.930105', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'instance-00000003', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'afa0c73e-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.690702553, 'message_signature': 'd94349fbf1bd3fa3e1f72f5949590b9491327291c8a8df4380856a4ba148ff93'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8-sda', 'timestamp': '2025-10-07T20:10:35.930105', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'instance-00000003', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'afa0d97c-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.690702553, 'message_signature': '4d15f04d4df3bcca77e8df2805f77cda112239e8e7d5fcad862a0a125acc09e3'}]}, 'timestamp': '2025-10-07 20:10:35.931884', '_unique_id': '36b5c79fa1e7410fbc2b9909b104964f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.932 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.934 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.934 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.934 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.935 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.935 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '44a95799-3a64-4f36-b917-72a69b20d7b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-vda', 'timestamp': '2025-10-07T20:10:35.934392', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'instance-00000002', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'afa14cc2-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.661886305, 'message_signature': '8565afa7681b468b5931ce8849e39cb9490fcd8eaa8cf164855e607c8a586e63'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 
'resource_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-sda', 'timestamp': '2025-10-07T20:10:35.934392', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'instance-00000002', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'afa15d16-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.661886305, 'message_signature': '22ce1bbe611918d35839802d067ca3bad751d02ca7464c445bdb7396abf6e6c6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8-vda', 'timestamp': '2025-10-07T20:10:35.934392', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'instance-00000003', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'afa16fae-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.690702553, 'message_signature': '40b70c236aacf54da6d45cacaa966287938e4454da51b27c20da6d22bd34f193'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8-sda', 'timestamp': '2025-10-07T20:10:35.934392', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'instance-00000003', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'afa17f8a-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.690702553, 'message_signature': 'af7be72dcfc271dbf22c3d304fa3d18087e07d3d6915d96fc64ec390e447c66f'}]}, 'timestamp': '2025-10-07 20:10:35.936133', '_unique_id': 'a81891a882c64db1abb65dc3556a0ba9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.937 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.938 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1247597446>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1391753478>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1247597446>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1391753478>]
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.938 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.938 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.938 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.938 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.939 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a3bd6321-6c12-43fa-85bb-c69e11f49bd2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-vda', 'timestamp': '2025-10-07T20:10:35.938308', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'instance-00000002', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'afa1e204-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.760992842, 'message_signature': '711c890de091629bbcec0141c65c55b45398c8556f96993d66e32c8e98201542'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-sda', 'timestamp': '2025-10-07T20:10:35.938308', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'instance-00000002', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'afa1ecb8-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.760992842, 'message_signature': '12949b27803365ea1d89a65d4cf27c665be49ea2b16a38e06274bb36bd035075'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8-vda', 'timestamp': '2025-10-07T20:10:35.938308', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'instance-00000003', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'afa1f6c2-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.781974134, 'message_signature': '258c42c36cd6affb6b2ba1f222aa277919eac43a8f8002fe9e618313ba4cadd0'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8-sda', 'timestamp': '2025-10-07T20:10:35.938308', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'instance-00000003', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'afa20086-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.781974134, 'message_signature': 'c98c032257dbc64151a11376a1bc5cf55599a0b7eb915ed95cdbc86b26602bab'}]}, 'timestamp': '2025-10-07 20:10:35.939413', '_unique_id': '70b569e3fcce49dfb2d5d1aaed0ef757'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.940 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.941 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e8708cc9-1f41-4dde-b48a-42dd2377207e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'timestamp': '2025-10-07T20:10:35.940944', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'instance-00000002', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'afa24906-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.831820896, 'message_signature': 'e26cce3130906ed74c1e49dd78e4bb788095ab631e347956aaad11d753305fe0'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'timestamp': '2025-10-07T20:10:35.940944', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'instance-00000003', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'afa25446-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.860721246, 'message_signature': '58a81e037f4c1cfa2744a8eb16d2e9a98cafc3c31f6603f2ccc8c26b9ede6ded'}]}, 'timestamp': '2025-10-07 20:10:35.941509', '_unique_id': 'e403f0c200f54251b9d71a843a911feb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.942 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.943 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2c9ab461-533a-4e67-8d34-b8ba6a003462', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-00000002-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-tap6ac9cc3d-50', 'timestamp': '2025-10-07T20:10:35.942934', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'tap6ac9cc3d-50', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:0b:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ac9cc3d-50'}, 'message_id': 'afa296b8-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.741872392, 'message_signature': 'f698674aa3c9ece48bbc7e6fd7c6022da2c4feeafec346f3ac3a87fc9bc89008'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': 'instance-00000003-31cd065b-2fe3-418f-869b-a5ac7f4405f8-tap9283f59d-4e', 'timestamp': '2025-10-07T20:10:35.942934', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'tap9283f59d-4e', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:bc:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9283f59d-4e'}, 'message_id': 'afa2a39c-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.748853163, 'message_signature': 'cf8d18e0d1a3886c683769c0513caca733601fcaed969d6d442de65a53bde888'}]}, 'timestamp': '2025-10-07 20:10:35.943555', '_unique_id': 'c06f80039f10449e9af446bb57636e38'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.944 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.945 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.945 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.945 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.945 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.945 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bff13c7a-5781-41fa-82fa-fc5c9af42b10', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-vda', 'timestamp': '2025-10-07T20:10:35.945119', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'instance-00000002', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'afa2ecf8-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.661886305, 'message_signature': '3e7c0921c1114ead3a2bf4225d98f97191131e6c59fa121509f7761acd76ddda'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 
'3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-sda', 'timestamp': '2025-10-07T20:10:35.945119', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'instance-00000002', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'afa2f766-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.661886305, 'message_signature': '0a658bd61bbd3448f5afe1c537016be884b53139d44ab20d2e4939256b443b0d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8-vda', 'timestamp': '2025-10-07T20:10:35.945119', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'instance-00000003', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'afa3010c-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.690702553, 'message_signature': 'c8dcc67e2749d8f608fd889fb4118cecc4928ccfefca89bae49898c70db54408'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8-sda', 'timestamp': '2025-10-07T20:10:35.945119', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'instance-00000003', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'afa30ada-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.690702553, 'message_signature': '9134c44858e4559acddf529f0910c8eddafb4dfcd7f994a83feeb534adcd6284'}]}, 'timestamp': '2025-10-07 20:10:35.946296', '_unique_id': '9dbeac462c1948feb923083691c91c1e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.946 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.947 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.947 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.948 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1247597446>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1391753478>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1247597446>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1391753478>]
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.948 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.948 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.948 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1247597446>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1391753478>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1247597446>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1391753478>]
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.948 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.948 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.948 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '849cc5d6-1bd4-47ae-abe1-9cda70a53d97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-00000002-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-tap6ac9cc3d-50', 'timestamp': '2025-10-07T20:10:35.948630', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'tap6ac9cc3d-50', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:0b:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ac9cc3d-50'}, 'message_id': 'afa3752e-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.741872392, 'message_signature': 'c5b60e717ba662cbe750660da9251c4eee68ff1d6ea24ecd96f446e8f8a393f3'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': 
'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': 'instance-00000003-31cd065b-2fe3-418f-869b-a5ac7f4405f8-tap9283f59d-4e', 'timestamp': '2025-10-07T20:10:35.948630', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'tap9283f59d-4e', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:bc:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9283f59d-4e'}, 'message_id': 'afa38050-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.748853163, 'message_signature': '7f9ca150a689866285c54b60bea12320f303621ac7eba1a95aa4ba68b6e7218f'}]}, 'timestamp': '2025-10-07 20:10:35.949200', '_unique_id': '1124f51f508446319b0955bd81e02034'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.949 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.950 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.950 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '337805c7-36b3-4a0a-bd33-0018ae68b883', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-00000002-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-tap6ac9cc3d-50', 'timestamp': '2025-10-07T20:10:35.950804', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'tap6ac9cc3d-50', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5d:0b:e9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ac9cc3d-50'}, 'message_id': 'afa3ca38-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.741872392, 'message_signature': '8a4464e219e9b940ec5028d783434e6100522c2dfb39924895106df8733026d1'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': 'instance-00000003-31cd065b-2fe3-418f-869b-a5ac7f4405f8-tap9283f59d-4e', 'timestamp': '2025-10-07T20:10:35.950804', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'tap9283f59d-4e', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6e:bc:c6', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9283f59d-4e'}, 'message_id': 'afa3d4f6-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.748853163, 'message_signature': 'ec57a212ed4305d5f23dcad3fa0de657fa2c9f4e7e5a8fa87c7e8f8b8c37a037'}]}, 'timestamp': '2025-10-07 20:10:35.951391', '_unique_id': '57312b40efd94101bfdbbdee7bf04c8a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.951 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.952 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.952 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.953 12 DEBUG ceilometer.compute.pollsters [-] 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.953 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.953 12 DEBUG ceilometer.compute.pollsters [-] 31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6c60349c-a5f0-4a0e-99f1-c9da0e327fcd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-vda', 'timestamp': '2025-10-07T20:10:35.952857', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'instance-00000002', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'afa41a4c-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.760992842, 'message_signature': '69e049b86fa82b26ea359c071d51d77364c040e05152b896a6cee520ed34248f'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 
'3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-sda', 'timestamp': '2025-10-07T20:10:35.952857', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1247597446', 'name': 'instance-00000002', 'instance_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'afa4255a-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.760992842, 'message_signature': '10022bb242c33660d12f75d180cea947a7c41f82e57c8f4a461ae449d7196dc8'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8-vda', 'timestamp': '2025-10-07T20:10:35.952857', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'instance-00000003', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': 
'3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'afa42f82-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.781974134, 'message_signature': '222b747a162620a19c86e25b2f3c6295a8cf79ea039994dd139c066abb2aafce'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_name': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_name': None, 'resource_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8-sda', 'timestamp': '2025-10-07T20:10:35.952857', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1391753478', 'name': 'instance-00000003', 'instance_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'instance_type': 'm1.nano', 'host': '0a789b44dab0b713c8b5767e9587cacc402440c5ec8a798e687eaeed', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'afa4395a-a3b9-11f0-9441-fa163e5cce8e', 'monotonic_time': 3471.781974134, 'message_signature': 'fa20bd8b9557e3a4a2bbf3fd1b3da035289c316a95e7995db2f6056e0246cccf'}]}, 'timestamp': '2025-10-07 20:10:35.953924', '_unique_id': '9eed954372be4aa98dd8f9f22277a020'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:10:35 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:10:35.954 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:10:36 np0005474864 podman[220765]: 2025-10-07 20:10:36.387165999 +0000 UTC m=+0.078004612 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  7 16:10:37 np0005474864 nova_compute[192593]: 2025-10-07 20:10:37.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:10:39 np0005474864 podman[220790]: 2025-10-07 20:10:39.380701693 +0000 UTC m=+0.076470638 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  7 16:10:39 np0005474864 nova_compute[192593]: 2025-10-07 20:10:39.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:10:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:40.347 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:76:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'ba:bb:91:e8:2b:5d'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  7 16:10:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:40.350 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.813 2 DEBUG nova.compute.manager [req-6271c41a-78bd-4a96-a158-ff8600408e38 req-b0238197-f267-4cef-93cc-7eb8865ba129 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Received event network-vif-plugged-6ac9cc3d-5039-41fa-a966-ec61d9e9c38b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.813 2 DEBUG oslo_concurrency.lockutils [req-6271c41a-78bd-4a96-a158-ff8600408e38 req-b0238197-f267-4cef-93cc-7eb8865ba129 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.814 2 DEBUG oslo_concurrency.lockutils [req-6271c41a-78bd-4a96-a158-ff8600408e38 req-b0238197-f267-4cef-93cc-7eb8865ba129 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.814 2 DEBUG oslo_concurrency.lockutils [req-6271c41a-78bd-4a96-a158-ff8600408e38 req-b0238197-f267-4cef-93cc-7eb8865ba129 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.814 2 DEBUG nova.compute.manager [req-6271c41a-78bd-4a96-a158-ff8600408e38 req-b0238197-f267-4cef-93cc-7eb8865ba129 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Processing event network-vif-plugged-6ac9cc3d-5039-41fa-a966-ec61d9e9c38b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.815 2 DEBUG nova.compute.manager [req-6271c41a-78bd-4a96-a158-ff8600408e38 req-b0238197-f267-4cef-93cc-7eb8865ba129 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Received event network-vif-plugged-6ac9cc3d-5039-41fa-a966-ec61d9e9c38b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.815 2 DEBUG oslo_concurrency.lockutils [req-6271c41a-78bd-4a96-a158-ff8600408e38 req-b0238197-f267-4cef-93cc-7eb8865ba129 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.815 2 DEBUG oslo_concurrency.lockutils [req-6271c41a-78bd-4a96-a158-ff8600408e38 req-b0238197-f267-4cef-93cc-7eb8865ba129 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.816 2 DEBUG oslo_concurrency.lockutils [req-6271c41a-78bd-4a96-a158-ff8600408e38 req-b0238197-f267-4cef-93cc-7eb8865ba129 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.816 2 DEBUG nova.compute.manager [req-6271c41a-78bd-4a96-a158-ff8600408e38 req-b0238197-f267-4cef-93cc-7eb8865ba129 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] No waiting events found dispatching network-vif-plugged-6ac9cc3d-5039-41fa-a966-ec61d9e9c38b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.816 2 WARNING nova.compute.manager [req-6271c41a-78bd-4a96-a158-ff8600408e38 req-b0238197-f267-4cef-93cc-7eb8865ba129 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Received unexpected event network-vif-plugged-6ac9cc3d-5039-41fa-a966-ec61d9e9c38b for instance with vm_state building and task_state spawning.
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.817 2 DEBUG nova.compute.manager [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Instance event wait completed in 23 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.822 2 DEBUG nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.827 2 INFO nova.virt.libvirt.driver [-] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Instance spawned successfully.
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.828 2 DEBUG nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.837 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759867840.8370519, 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.838 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] VM Resumed (Lifecycle Event)
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.849 2 DEBUG nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.850 2 DEBUG nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.851 2 DEBUG nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.851 2 DEBUG nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.852 2 DEBUG nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.853 2 DEBUG nova.virt.libvirt.driver [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.860 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.864 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.893 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.910 2 INFO nova.compute.manager [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Took 35.44 seconds to spawn the instance on the hypervisor.#033[00m
Oct  7 16:10:40 np0005474864 nova_compute[192593]: 2025-10-07 20:10:40.914 2 DEBUG nova.compute.manager [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:10:41 np0005474864 nova_compute[192593]: 2025-10-07 20:10:41.013 2 INFO nova.compute.manager [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Took 36.00 seconds to build instance.#033[00m
Oct  7 16:10:41 np0005474864 nova_compute[192593]: 2025-10-07 20:10:41.044 2 DEBUG oslo_concurrency.lockutils [None req-cfb7a53e-eb8e-4674-b58a-7bf3ac38400c fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 36.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:42 np0005474864 nova_compute[192593]: 2025-10-07 20:10:42.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:43 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:10:43.354 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.404 2 DEBUG nova.compute.manager [req-114c856f-d440-4969-a14b-85dee93c4a26 req-29b9c81d-a09b-4d12-b1d8-0ee6305f32aa 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Received event network-vif-plugged-9283f59d-4eb5-4e2a-876d-b078582f6dec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.405 2 DEBUG oslo_concurrency.lockutils [req-114c856f-d440-4969-a14b-85dee93c4a26 req-29b9c81d-a09b-4d12-b1d8-0ee6305f32aa 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.406 2 DEBUG oslo_concurrency.lockutils [req-114c856f-d440-4969-a14b-85dee93c4a26 req-29b9c81d-a09b-4d12-b1d8-0ee6305f32aa 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.406 2 DEBUG oslo_concurrency.lockutils [req-114c856f-d440-4969-a14b-85dee93c4a26 req-29b9c81d-a09b-4d12-b1d8-0ee6305f32aa 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.406 2 DEBUG nova.compute.manager [req-114c856f-d440-4969-a14b-85dee93c4a26 req-29b9c81d-a09b-4d12-b1d8-0ee6305f32aa 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Processing event network-vif-plugged-9283f59d-4eb5-4e2a-876d-b078582f6dec _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.407 2 DEBUG nova.compute.manager [req-114c856f-d440-4969-a14b-85dee93c4a26 req-29b9c81d-a09b-4d12-b1d8-0ee6305f32aa 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Received event network-vif-plugged-9283f59d-4eb5-4e2a-876d-b078582f6dec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.407 2 DEBUG oslo_concurrency.lockutils [req-114c856f-d440-4969-a14b-85dee93c4a26 req-29b9c81d-a09b-4d12-b1d8-0ee6305f32aa 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.408 2 DEBUG oslo_concurrency.lockutils [req-114c856f-d440-4969-a14b-85dee93c4a26 req-29b9c81d-a09b-4d12-b1d8-0ee6305f32aa 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.408 2 DEBUG oslo_concurrency.lockutils [req-114c856f-d440-4969-a14b-85dee93c4a26 req-29b9c81d-a09b-4d12-b1d8-0ee6305f32aa 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.409 2 DEBUG nova.compute.manager [req-114c856f-d440-4969-a14b-85dee93c4a26 req-29b9c81d-a09b-4d12-b1d8-0ee6305f32aa 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] No waiting events found dispatching network-vif-plugged-9283f59d-4eb5-4e2a-876d-b078582f6dec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.409 2 WARNING nova.compute.manager [req-114c856f-d440-4969-a14b-85dee93c4a26 req-29b9c81d-a09b-4d12-b1d8-0ee6305f32aa 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Received unexpected event network-vif-plugged-9283f59d-4eb5-4e2a-876d-b078582f6dec for instance with vm_state building and task_state spawning.#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.410 2 DEBUG nova.compute.manager [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Instance event wait completed in 26 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.414 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759867843.4140816, 31cd065b-2fe3-418f-869b-a5ac7f4405f8 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.415 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.419 2 DEBUG nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.424 2 INFO nova.virt.libvirt.driver [-] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Instance spawned successfully.#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.425 2 DEBUG nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.434 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.447 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.460 2 DEBUG nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.461 2 DEBUG nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.463 2 DEBUG nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.464 2 DEBUG nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.464 2 DEBUG nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.465 2 DEBUG nova.virt.libvirt.driver [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.483 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.627 2 INFO nova.compute.manager [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Took 37.97 seconds to spawn the instance on the hypervisor.#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.628 2 DEBUG nova.compute.manager [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.756 2 INFO nova.compute.manager [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Took 38.70 seconds to build instance.#033[00m
Oct  7 16:10:43 np0005474864 nova_compute[192593]: 2025-10-07 20:10:43.799 2 DEBUG oslo_concurrency.lockutils [None req-6b54dce5-9903-4b1f-87b6-47d463150b41 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 38.870s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:10:44 np0005474864 nova_compute[192593]: 2025-10-07 20:10:44.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:45 np0005474864 nova_compute[192593]: 2025-10-07 20:10:45.850 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:45 np0005474864 NetworkManager[51631]: <info>  [1759867845.8525] manager: (patch-provnet-53a6f422-cce6-4082-98aa-3619989346bc-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/31)
Oct  7 16:10:45 np0005474864 NetworkManager[51631]: <info>  [1759867845.8540] device (patch-provnet-53a6f422-cce6-4082-98aa-3619989346bc-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  7 16:10:45 np0005474864 NetworkManager[51631]: <info>  [1759867845.8567] manager: (patch-br-int-to-provnet-53a6f422-cce6-4082-98aa-3619989346bc): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/32)
Oct  7 16:10:45 np0005474864 NetworkManager[51631]: <info>  [1759867845.8578] device (patch-br-int-to-provnet-53a6f422-cce6-4082-98aa-3619989346bc)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Oct  7 16:10:45 np0005474864 NetworkManager[51631]: <info>  [1759867845.8600] manager: (patch-br-int-to-provnet-53a6f422-cce6-4082-98aa-3619989346bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Oct  7 16:10:45 np0005474864 NetworkManager[51631]: <info>  [1759867845.8614] manager: (patch-provnet-53a6f422-cce6-4082-98aa-3619989346bc-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/34)
Oct  7 16:10:45 np0005474864 NetworkManager[51631]: <info>  [1759867845.8625] device (patch-provnet-53a6f422-cce6-4082-98aa-3619989346bc-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  7 16:10:45 np0005474864 NetworkManager[51631]: <info>  [1759867845.8634] device (patch-br-int-to-provnet-53a6f422-cce6-4082-98aa-3619989346bc)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Oct  7 16:10:46 np0005474864 nova_compute[192593]: 2025-10-07 20:10:46.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:46 np0005474864 ovn_controller[94801]: 2025-10-07T20:10:46Z|00037|binding|INFO|Releasing lport c5e7e333-20ed-4e0b-a782-2e5e61b11c88 from this chassis (sb_readonly=0)
Oct  7 16:10:46 np0005474864 ovn_controller[94801]: 2025-10-07T20:10:46Z|00038|binding|INFO|Releasing lport 7ace7da2-42dc-433a-8b4d-8286301cfa0e from this chassis (sb_readonly=0)
Oct  7 16:10:46 np0005474864 nova_compute[192593]: 2025-10-07 20:10:46.134 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:46 np0005474864 nova_compute[192593]: 2025-10-07 20:10:46.839 2 DEBUG nova.compute.manager [req-9626db08-25aa-44e4-8389-43993886a6c7 req-d7d7796f-85cf-4e81-b1ed-bf847031a14c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Received event network-changed-6ac9cc3d-5039-41fa-a966-ec61d9e9c38b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:10:46 np0005474864 nova_compute[192593]: 2025-10-07 20:10:46.840 2 DEBUG nova.compute.manager [req-9626db08-25aa-44e4-8389-43993886a6c7 req-d7d7796f-85cf-4e81-b1ed-bf847031a14c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Refreshing instance network info cache due to event network-changed-6ac9cc3d-5039-41fa-a966-ec61d9e9c38b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:10:46 np0005474864 nova_compute[192593]: 2025-10-07 20:10:46.840 2 DEBUG oslo_concurrency.lockutils [req-9626db08-25aa-44e4-8389-43993886a6c7 req-d7d7796f-85cf-4e81-b1ed-bf847031a14c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:10:46 np0005474864 nova_compute[192593]: 2025-10-07 20:10:46.840 2 DEBUG oslo_concurrency.lockutils [req-9626db08-25aa-44e4-8389-43993886a6c7 req-d7d7796f-85cf-4e81-b1ed-bf847031a14c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:10:46 np0005474864 nova_compute[192593]: 2025-10-07 20:10:46.841 2 DEBUG nova.network.neutron [req-9626db08-25aa-44e4-8389-43993886a6c7 req-d7d7796f-85cf-4e81-b1ed-bf847031a14c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Refreshing network info cache for port 6ac9cc3d-5039-41fa-a966-ec61d9e9c38b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:10:47 np0005474864 nova_compute[192593]: 2025-10-07 20:10:47.463 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:47 np0005474864 nova_compute[192593]: 2025-10-07 20:10:47.982 2 DEBUG nova.network.neutron [req-9626db08-25aa-44e4-8389-43993886a6c7 req-d7d7796f-85cf-4e81-b1ed-bf847031a14c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Updated VIF entry in instance network info cache for port 6ac9cc3d-5039-41fa-a966-ec61d9e9c38b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:10:47 np0005474864 nova_compute[192593]: 2025-10-07 20:10:47.983 2 DEBUG nova.network.neutron [req-9626db08-25aa-44e4-8389-43993886a6c7 req-d7d7796f-85cf-4e81-b1ed-bf847031a14c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Updating instance_info_cache with network_info: [{"id": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "address": "fa:16:3e:5d:0b:e9", "network": {"id": "a9053617-1148-4139-a949-8321e760481f", "bridge": "br-int", "label": "tempest-network-smoke--1784539601", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac9cc3d-50", "ovs_interfaceid": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:10:48 np0005474864 nova_compute[192593]: 2025-10-07 20:10:48.007 2 DEBUG oslo_concurrency.lockutils [req-9626db08-25aa-44e4-8389-43993886a6c7 req-d7d7796f-85cf-4e81-b1ed-bf847031a14c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:10:49 np0005474864 nova_compute[192593]: 2025-10-07 20:10:49.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:50 np0005474864 nova_compute[192593]: 2025-10-07 20:10:50.152 2 DEBUG nova.compute.manager [req-eabbcb04-b62e-4ff6-aafa-d7c17d254204 req-d1577bf2-8f69-4938-bcb7-ba48bd3e1efb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Received event network-changed-9283f59d-4eb5-4e2a-876d-b078582f6dec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:10:50 np0005474864 nova_compute[192593]: 2025-10-07 20:10:50.153 2 DEBUG nova.compute.manager [req-eabbcb04-b62e-4ff6-aafa-d7c17d254204 req-d1577bf2-8f69-4938-bcb7-ba48bd3e1efb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Refreshing instance network info cache due to event network-changed-9283f59d-4eb5-4e2a-876d-b078582f6dec. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:10:50 np0005474864 nova_compute[192593]: 2025-10-07 20:10:50.154 2 DEBUG oslo_concurrency.lockutils [req-eabbcb04-b62e-4ff6-aafa-d7c17d254204 req-d1577bf2-8f69-4938-bcb7-ba48bd3e1efb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-31cd065b-2fe3-418f-869b-a5ac7f4405f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:10:50 np0005474864 nova_compute[192593]: 2025-10-07 20:10:50.154 2 DEBUG oslo_concurrency.lockutils [req-eabbcb04-b62e-4ff6-aafa-d7c17d254204 req-d1577bf2-8f69-4938-bcb7-ba48bd3e1efb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-31cd065b-2fe3-418f-869b-a5ac7f4405f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:10:50 np0005474864 nova_compute[192593]: 2025-10-07 20:10:50.155 2 DEBUG nova.network.neutron [req-eabbcb04-b62e-4ff6-aafa-d7c17d254204 req-d1577bf2-8f69-4938-bcb7-ba48bd3e1efb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Refreshing network info cache for port 9283f59d-4eb5-4e2a-876d-b078582f6dec _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:10:51 np0005474864 nova_compute[192593]: 2025-10-07 20:10:51.393 2 DEBUG nova.network.neutron [req-eabbcb04-b62e-4ff6-aafa-d7c17d254204 req-d1577bf2-8f69-4938-bcb7-ba48bd3e1efb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Updated VIF entry in instance network info cache for port 9283f59d-4eb5-4e2a-876d-b078582f6dec. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:10:51 np0005474864 nova_compute[192593]: 2025-10-07 20:10:51.395 2 DEBUG nova.network.neutron [req-eabbcb04-b62e-4ff6-aafa-d7c17d254204 req-d1577bf2-8f69-4938-bcb7-ba48bd3e1efb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Updating instance_info_cache with network_info: [{"id": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "address": "fa:16:3e:6e:bc:c6", "network": {"id": "3c6f15ee-a1fe-4807-8291-599e41409640", "bridge": "br-int", "label": "tempest-network-smoke--1071304693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9283f59d-4e", "ovs_interfaceid": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:10:51 np0005474864 nova_compute[192593]: 2025-10-07 20:10:51.422 2 DEBUG oslo_concurrency.lockutils [req-eabbcb04-b62e-4ff6-aafa-d7c17d254204 req-d1577bf2-8f69-4938-bcb7-ba48bd3e1efb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-31cd065b-2fe3-418f-869b-a5ac7f4405f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:10:52 np0005474864 nova_compute[192593]: 2025-10-07 20:10:52.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:53 np0005474864 podman[220833]: 2025-10-07 20:10:53.396394686 +0000 UTC m=+0.085572592 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, name=ubi9-minimal, config_id=edpm, distribution-scope=public, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41)
Oct  7 16:10:53 np0005474864 podman[220832]: 2025-10-07 20:10:53.402585104 +0000 UTC m=+0.090679269 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:10:53 np0005474864 ovn_controller[94801]: 2025-10-07T20:10:53Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5d:0b:e9 10.100.0.11
Oct  7 16:10:53 np0005474864 ovn_controller[94801]: 2025-10-07T20:10:53Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5d:0b:e9 10.100.0.11
Oct  7 16:10:54 np0005474864 nova_compute[192593]: 2025-10-07 20:10:54.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:56 np0005474864 ovn_controller[94801]: 2025-10-07T20:10:56Z|00039|memory|INFO|peak resident set size grew 52% in last 1038.3 seconds, from 16896 kB to 25728 kB
Oct  7 16:10:56 np0005474864 ovn_controller[94801]: 2025-10-07T20:10:56Z|00040|memory|INFO|idl-cells-OVN_Southbound:13600 idl-cells-Open_vSwitch:927 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:490 lflow-cache-entries-cache-matches:320 lflow-cache-size-KB:1992 local_datapath_usage-KB:3 ofctrl_desired_flow_usage-KB:832 ofctrl_installed_flow_usage-KB:610 ofctrl_sb_flow_ref_usage-KB:307
Oct  7 16:10:57 np0005474864 ovn_controller[94801]: 2025-10-07T20:10:57Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:6e:bc:c6 10.100.0.12
Oct  7 16:10:57 np0005474864 ovn_controller[94801]: 2025-10-07T20:10:57Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:6e:bc:c6 10.100.0.12
Oct  7 16:10:57 np0005474864 nova_compute[192593]: 2025-10-07 20:10:57.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:59 np0005474864 podman[220894]: 2025-10-07 20:10:59.38762973 +0000 UTC m=+0.070093376 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3)
Oct  7 16:10:59 np0005474864 podman[220892]: 2025-10-07 20:10:59.392366667 +0000 UTC m=+0.080908308 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct  7 16:10:59 np0005474864 nova_compute[192593]: 2025-10-07 20:10:59.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:10:59 np0005474864 podman[220893]: 2025-10-07 20:10:59.465433598 +0000 UTC m=+0.152286610 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:11:00 np0005474864 nova_compute[192593]: 2025-10-07 20:11:00.768 2 INFO nova.compute.manager [None req-fc17e27a-75f6-4171-9f53-badae9900b1b fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Get console output#033[00m
Oct  7 16:11:00 np0005474864 nova_compute[192593]: 2025-10-07 20:11:00.873 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  7 16:11:01 np0005474864 nova_compute[192593]: 2025-10-07 20:11:01.189 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:02 np0005474864 nova_compute[192593]: 2025-10-07 20:11:02.473 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:03 np0005474864 nova_compute[192593]: 2025-10-07 20:11:03.297 2 INFO nova.compute.manager [None req-dfa3db6a-9c63-406c-ba93-c0ad0119a0ed db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Get console output#033[00m
Oct  7 16:11:03 np0005474864 nova_compute[192593]: 2025-10-07 20:11:03.304 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  7 16:11:03 np0005474864 podman[220954]: 2025-10-07 20:11:03.40458274 +0000 UTC m=+0.080916168 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  7 16:11:04 np0005474864 nova_compute[192593]: 2025-10-07 20:11:04.448 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:05 np0005474864 nova_compute[192593]: 2025-10-07 20:11:05.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:05 np0005474864 nova_compute[192593]: 2025-10-07 20:11:05.610 2 INFO nova.compute.manager [None req-a7b0216b-f6a4-4c0f-953d-1832fcc1ce6f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Get console output#033[00m
Oct  7 16:11:05 np0005474864 nova_compute[192593]: 2025-10-07 20:11:05.617 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  7 16:11:07 np0005474864 podman[220973]: 2025-10-07 20:11:07.36519706 +0000 UTC m=+0.057339360 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  7 16:11:07 np0005474864 nova_compute[192593]: 2025-10-07 20:11:07.474 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:08 np0005474864 nova_compute[192593]: 2025-10-07 20:11:08.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:09 np0005474864 nova_compute[192593]: 2025-10-07 20:11:09.062 2 DEBUG oslo_concurrency.lockutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Acquiring lock "refresh_cache-31cd065b-2fe3-418f-869b-a5ac7f4405f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:11:09 np0005474864 nova_compute[192593]: 2025-10-07 20:11:09.063 2 DEBUG oslo_concurrency.lockutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Acquired lock "refresh_cache-31cd065b-2fe3-418f-869b-a5ac7f4405f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:11:09 np0005474864 nova_compute[192593]: 2025-10-07 20:11:09.064 2 DEBUG nova.network.neutron [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:11:09 np0005474864 nova_compute[192593]: 2025-10-07 20:11:09.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:10 np0005474864 podman[220997]: 2025-10-07 20:11:10.40189303 +0000 UTC m=+0.088751614 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, config_id=edpm)
Oct  7 16:11:12 np0005474864 nova_compute[192593]: 2025-10-07 20:11:12.478 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:13 np0005474864 nova_compute[192593]: 2025-10-07 20:11:12.999 2 DEBUG nova.network.neutron [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Updating instance_info_cache with network_info: [{"id": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "address": "fa:16:3e:6e:bc:c6", "network": {"id": "3c6f15ee-a1fe-4807-8291-599e41409640", "bridge": "br-int", "label": "tempest-network-smoke--1071304693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9283f59d-4e", "ovs_interfaceid": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:11:13 np0005474864 nova_compute[192593]: 2025-10-07 20:11:13.023 2 DEBUG oslo_concurrency.lockutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Releasing lock "refresh_cache-31cd065b-2fe3-418f-869b-a5ac7f4405f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:11:13 np0005474864 nova_compute[192593]: 2025-10-07 20:11:13.133 2 DEBUG nova.virt.libvirt.driver [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Oct  7 16:11:13 np0005474864 nova_compute[192593]: 2025-10-07 20:11:13.134 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Creating file /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/967e45df317c4972bf62aec9cf8a7095.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Oct  7 16:11:13 np0005474864 nova_compute[192593]: 2025-10-07 20:11:13.134 2 DEBUG oslo_concurrency.processutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/967e45df317c4972bf62aec9cf8a7095.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:11:13 np0005474864 nova_compute[192593]: 2025-10-07 20:11:13.565 2 DEBUG oslo_concurrency.processutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/967e45df317c4972bf62aec9cf8a7095.tmp" returned: 1 in 0.430s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:11:13 np0005474864 nova_compute[192593]: 2025-10-07 20:11:13.566 2 DEBUG oslo_concurrency.processutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/967e45df317c4972bf62aec9cf8a7095.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Oct  7 16:11:13 np0005474864 nova_compute[192593]: 2025-10-07 20:11:13.566 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Creating directory /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Oct  7 16:11:13 np0005474864 nova_compute[192593]: 2025-10-07 20:11:13.567 2 DEBUG oslo_concurrency.processutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:11:13 np0005474864 nova_compute[192593]: 2025-10-07 20:11:13.696 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "b00f20b4-40d9-4fe7-8782-20859f161134" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:13 np0005474864 nova_compute[192593]: 2025-10-07 20:11:13.697 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "b00f20b4-40d9-4fe7-8782-20859f161134" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:13 np0005474864 nova_compute[192593]: 2025-10-07 20:11:13.717 2 DEBUG nova.compute.manager [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  7 16:11:13 np0005474864 nova_compute[192593]: 2025-10-07 20:11:13.789 2 DEBUG oslo_concurrency.processutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8" returned: 0 in 0.223s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:11:13 np0005474864 nova_compute[192593]: 2025-10-07 20:11:13.794 2 DEBUG nova.virt.libvirt.driver [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.088 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.089 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.113 2 DEBUG nova.virt.hardware [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.114 2 INFO nova.compute.claims [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.433 2 DEBUG nova.compute.provider_tree [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.455 2 DEBUG nova.scheduler.client.report [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.493 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.404s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.494 2 DEBUG nova.compute.manager [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.588 2 DEBUG nova.compute.manager [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.588 2 DEBUG nova.network.neutron [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.617 2 INFO nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.637 2 DEBUG nova.compute.manager [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.729 2 DEBUG nova.compute.manager [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.731 2 DEBUG nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.732 2 INFO nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Creating image(s)#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.732 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "/var/lib/nova/instances/b00f20b4-40d9-4fe7-8782-20859f161134/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.733 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "/var/lib/nova/instances/b00f20b4-40d9-4fe7-8782-20859f161134/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.734 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "/var/lib/nova/instances/b00f20b4-40d9-4fe7-8782-20859f161134/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.749 2 DEBUG oslo_concurrency.processutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.817 2 DEBUG oslo_concurrency.processutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.818 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.819 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.833 2 DEBUG oslo_concurrency.processutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.913 2 DEBUG oslo_concurrency.processutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.916 2 DEBUG oslo_concurrency.processutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/b00f20b4-40d9-4fe7-8782-20859f161134/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.977 2 DEBUG oslo_concurrency.processutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/b00f20b4-40d9-4fe7-8782-20859f161134/disk 1073741824" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.979 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.160s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:14 np0005474864 nova_compute[192593]: 2025-10-07 20:11:14.980 2 DEBUG oslo_concurrency.processutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:11:15 np0005474864 nova_compute[192593]: 2025-10-07 20:11:15.037 2 DEBUG oslo_concurrency.processutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:11:15 np0005474864 nova_compute[192593]: 2025-10-07 20:11:15.039 2 DEBUG nova.virt.disk.api [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Checking if we can resize image /var/lib/nova/instances/b00f20b4-40d9-4fe7-8782-20859f161134/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  7 16:11:15 np0005474864 nova_compute[192593]: 2025-10-07 20:11:15.040 2 DEBUG oslo_concurrency.processutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b00f20b4-40d9-4fe7-8782-20859f161134/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:11:15 np0005474864 nova_compute[192593]: 2025-10-07 20:11:15.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:11:15 np0005474864 nova_compute[192593]: 2025-10-07 20:11:15.101 2 DEBUG oslo_concurrency.processutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b00f20b4-40d9-4fe7-8782-20859f161134/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:11:15 np0005474864 nova_compute[192593]: 2025-10-07 20:11:15.101 2 DEBUG nova.virt.disk.api [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Cannot resize image /var/lib/nova/instances/b00f20b4-40d9-4fe7-8782-20859f161134/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  7 16:11:15 np0005474864 nova_compute[192593]: 2025-10-07 20:11:15.102 2 DEBUG nova.objects.instance [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lazy-loading 'migration_context' on Instance uuid b00f20b4-40d9-4fe7-8782-20859f161134 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:11:15 np0005474864 nova_compute[192593]: 2025-10-07 20:11:15.132 2 DEBUG nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  7 16:11:15 np0005474864 nova_compute[192593]: 2025-10-07 20:11:15.133 2 DEBUG nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Ensure instance console log exists: /var/lib/nova/instances/b00f20b4-40d9-4fe7-8782-20859f161134/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  7 16:11:15 np0005474864 nova_compute[192593]: 2025-10-07 20:11:15.133 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:15 np0005474864 nova_compute[192593]: 2025-10-07 20:11:15.134 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:15 np0005474864 nova_compute[192593]: 2025-10-07 20:11:15.134 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:15 np0005474864 nova_compute[192593]: 2025-10-07 20:11:15.195 2 DEBUG nova.policy [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  7 16:11:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:16.183 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:16.184 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:16.185 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:16 np0005474864 kernel: tap9283f59d-4e (unregistering): left promiscuous mode
Oct  7 16:11:16 np0005474864 NetworkManager[51631]: <info>  [1759867876.4003] device (tap9283f59d-4e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:11:16 np0005474864 nova_compute[192593]: 2025-10-07 20:11:16.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:16 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:16Z|00041|binding|INFO|Releasing lport 9283f59d-4eb5-4e2a-876d-b078582f6dec from this chassis (sb_readonly=0)
Oct  7 16:11:16 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:16Z|00042|binding|INFO|Setting lport 9283f59d-4eb5-4e2a-876d-b078582f6dec down in Southbound
Oct  7 16:11:16 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:16Z|00043|binding|INFO|Removing iface tap9283f59d-4e ovn-installed in OVS
Oct  7 16:11:16 np0005474864 nova_compute[192593]: 2025-10-07 20:11:16.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:16.420 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6e:bc:c6 10.100.0.12'], port_security=['fa:16:3e:6e:bc:c6 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '31cd065b-2fe3-418f-869b-a5ac7f4405f8', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3c6f15ee-a1fe-4807-8291-599e41409640', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'c698aec5-5b02-413c-9427-198232253cc4', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.178'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=63dd33cd-4919-4f9e-b01e-a4cd30047532, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=9283f59d-4eb5-4e2a-876d-b078582f6dec) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:11:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:16.422 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 9283f59d-4eb5-4e2a-876d-b078582f6dec in datapath 3c6f15ee-a1fe-4807-8291-599e41409640 unbound from our chassis#033[00m
Oct  7 16:11:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:16.426 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3c6f15ee-a1fe-4807-8291-599e41409640, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:11:16 np0005474864 nova_compute[192593]: 2025-10-07 20:11:16.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:16.428 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[88aa1e60-aaf9-4b26-8c88-bdb52974680a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:16.429 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640 namespace which is not needed anymore#033[00m
Oct  7 16:11:16 np0005474864 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Deactivated successfully.
Oct  7 16:11:16 np0005474864 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000003.scope: Consumed 14.273s CPU time.
Oct  7 16:11:16 np0005474864 systemd-machined[152586]: Machine qemu-1-instance-00000003 terminated.
Oct  7 16:11:16 np0005474864 neutron-haproxy-ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640[220527]: [NOTICE]   (220531) : haproxy version is 2.8.14-c23fe91
Oct  7 16:11:16 np0005474864 neutron-haproxy-ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640[220527]: [NOTICE]   (220531) : path to executable is /usr/sbin/haproxy
Oct  7 16:11:16 np0005474864 neutron-haproxy-ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640[220527]: [WARNING]  (220531) : Exiting Master process...
Oct  7 16:11:16 np0005474864 neutron-haproxy-ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640[220527]: [ALERT]    (220531) : Current worker (220533) exited with code 143 (Terminated)
Oct  7 16:11:16 np0005474864 neutron-haproxy-ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640[220527]: [WARNING]  (220531) : All workers exited. Exiting... (0)
Oct  7 16:11:16 np0005474864 systemd[1]: libpod-c41131ea94a970e003535bc3b843d3b7bb0691733ff41bd4e9d4ba7c84e575e5.scope: Deactivated successfully.
Oct  7 16:11:16 np0005474864 podman[221057]: 2025-10-07 20:11:16.639512453 +0000 UTC m=+0.094575121 container died c41131ea94a970e003535bc3b843d3b7bb0691733ff41bd4e9d4ba7c84e575e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:11:16 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c41131ea94a970e003535bc3b843d3b7bb0691733ff41bd4e9d4ba7c84e575e5-userdata-shm.mount: Deactivated successfully.
Oct  7 16:11:16 np0005474864 systemd[1]: var-lib-containers-storage-overlay-09d66dc9225f4ae99f35730a2ed566c60cf3527cb5f5f9599832757c233eb4a7-merged.mount: Deactivated successfully.
Oct  7 16:11:16 np0005474864 nova_compute[192593]: 2025-10-07 20:11:16.813 2 INFO nova.virt.libvirt.driver [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Instance shutdown successfully after 3 seconds.#033[00m
Oct  7 16:11:16 np0005474864 nova_compute[192593]: 2025-10-07 20:11:16.821 2 INFO nova.virt.libvirt.driver [-] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Instance destroyed successfully.#033[00m
Oct  7 16:11:16 np0005474864 nova_compute[192593]: 2025-10-07 20:11:16.823 2 DEBUG nova.virt.libvirt.vif [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:10:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1391753478',display_name='tempest-TestNetworkAdvancedServerOps-server-1391753478',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1391753478',id=3,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHGp9K/m1XQlBJloQMlWOiYAkMHRg/+YyV7EIFeU64B1nJDtz1wGfsQsDxfqhOEvcl/IBS6gweH/4Fue49rFzrh66+jFDwTRyWcSgsUsGaMU3Uma/s2qqLF3+L5vxqg9xw==',key_name='tempest-TestNetworkAdvancedServerOps-1360006664',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:10:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='8a545a398e2e433bbe3f3dfa2ec4ebcb',ramdisk_id='',reservation_id='r-6hld8jn4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-585003851',owner_user_name='tempest-TestNetworkAdvancedServerOps-585003851-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:11:08Z,user_data=None,user_id='db22b0e0f6594362af24484ba9b01936',uuid=31cd065b-2fe3-418f-869b-a5ac7f4405f8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "address": "fa:16:3e:6e:bc:c6", "network": {"id": "3c6f15ee-a1fe-4807-8291-599e41409640", "bridge": "br-int", "label": "tempest-network-smoke--1071304693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1071304693", "vif_mac": "fa:16:3e:6e:bc:c6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9283f59d-4e", "ovs_interfaceid": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:11:16 np0005474864 nova_compute[192593]: 2025-10-07 20:11:16.824 2 DEBUG nova.network.os_vif_util [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Converting VIF {"id": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "address": "fa:16:3e:6e:bc:c6", "network": {"id": "3c6f15ee-a1fe-4807-8291-599e41409640", "bridge": "br-int", "label": "tempest-network-smoke--1071304693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1071304693", "vif_mac": "fa:16:3e:6e:bc:c6"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9283f59d-4e", "ovs_interfaceid": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:11:16 np0005474864 nova_compute[192593]: 2025-10-07 20:11:16.825 2 DEBUG nova.network.os_vif_util [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6e:bc:c6,bridge_name='br-int',has_traffic_filtering=True,id=9283f59d-4eb5-4e2a-876d-b078582f6dec,network=Network(3c6f15ee-a1fe-4807-8291-599e41409640),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9283f59d-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:11:16 np0005474864 nova_compute[192593]: 2025-10-07 20:11:16.826 2 DEBUG os_vif [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6e:bc:c6,bridge_name='br-int',has_traffic_filtering=True,id=9283f59d-4eb5-4e2a-876d-b078582f6dec,network=Network(3c6f15ee-a1fe-4807-8291-599e41409640),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9283f59d-4e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:11:16 np0005474864 nova_compute[192593]: 2025-10-07 20:11:16.830 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:16 np0005474864 nova_compute[192593]: 2025-10-07 20:11:16.830 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9283f59d-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:11:16 np0005474864 nova_compute[192593]: 2025-10-07 20:11:16.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:16 np0005474864 nova_compute[192593]: 2025-10-07 20:11:16.834 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:16 np0005474864 nova_compute[192593]: 2025-10-07 20:11:16.838 2 INFO os_vif [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6e:bc:c6,bridge_name='br-int',has_traffic_filtering=True,id=9283f59d-4eb5-4e2a-876d-b078582f6dec,network=Network(3c6f15ee-a1fe-4807-8291-599e41409640),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9283f59d-4e')#033[00m
Oct  7 16:11:16 np0005474864 nova_compute[192593]: 2025-10-07 20:11:16.844 2 DEBUG oslo_concurrency.processutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:11:16 np0005474864 podman[221057]: 2025-10-07 20:11:16.887480844 +0000 UTC m=+0.342543452 container cleanup c41131ea94a970e003535bc3b843d3b7bb0691733ff41bd4e9d4ba7c84e575e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  7 16:11:16 np0005474864 systemd[1]: libpod-conmon-c41131ea94a970e003535bc3b843d3b7bb0691733ff41bd4e9d4ba7c84e575e5.scope: Deactivated successfully.
Oct  7 16:11:16 np0005474864 nova_compute[192593]: 2025-10-07 20:11:16.959 2 DEBUG oslo_concurrency.processutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk --force-share --output=json" returned: 0 in 0.114s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:11:16 np0005474864 nova_compute[192593]: 2025-10-07 20:11:16.960 2 DEBUG oslo_concurrency.processutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.033 2 DEBUG oslo_concurrency.processutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.036 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Copying file /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8_resize/disk to 192.168.122.101:/var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.037 2 DEBUG oslo_concurrency.processutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8_resize/disk 192.168.122.101:/var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:11:17 np0005474864 podman[221107]: 2025-10-07 20:11:17.078846846 +0000 UTC m=+0.161887316 container remove c41131ea94a970e003535bc3b843d3b7bb0691733ff41bd4e9d4ba7c84e575e5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:11:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:17.086 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[29de9a42-bd8c-451f-bae2-2acfb6d89015]: (4, ('Tue Oct  7 08:11:16 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640 (c41131ea94a970e003535bc3b843d3b7bb0691733ff41bd4e9d4ba7c84e575e5)\nc41131ea94a970e003535bc3b843d3b7bb0691733ff41bd4e9d4ba7c84e575e5\nTue Oct  7 08:11:16 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640 (c41131ea94a970e003535bc3b843d3b7bb0691733ff41bd4e9d4ba7c84e575e5)\nc41131ea94a970e003535bc3b843d3b7bb0691733ff41bd4e9d4ba7c84e575e5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:17.089 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[70c7d036-546d-41e2-b164-1ea8bea6ce23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:17.090 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3c6f15ee-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:17 np0005474864 kernel: tap3c6f15ee-a0: left promiscuous mode
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.095 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:17.109 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[870cb47b-1510-49ad-8b90-2e9b4cf1c274]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.119 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.120 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.120 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.120 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:11:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:17.142 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[29eaea92-32e9-4c44-8e33-ee28a66b8b04]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:17.144 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[19fd1601-c0b7-47d4-a89f-e1f25c91ad3b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:17.167 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[3d03fd01-1a8d-43b2-928f-8f97f4507f7f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345397, 'reachable_time': 32408, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221128, 'error': None, 'target': 'ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:17 np0005474864 systemd[1]: run-netns-ovnmeta\x2d3c6f15ee\x2da1fe\x2d4807\x2d8291\x2d599e41409640.mount: Deactivated successfully.
Oct  7 16:11:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:17.181 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3c6f15ee-a1fe-4807-8291-599e41409640 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:11:17 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:17.181 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[e1073e8e-f6aa-4f08-92f5-e742b9766187]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.205 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.273 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.274 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.357 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679/disk --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.368 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Periodic task is updating the host stats, it is trying to get disk info for instance-00000003, but the backing disk storage was removed by a concurrent operation such as resize. Error: No disk at /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk: nova.exception.DiskNotFound: No disk at /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.532 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.533 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5592MB free_disk=73.40980911254883GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.534 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.534 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.603 2 INFO nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Updating resource usage from migration 1b763f3f-6a93-4f21-a96f-734cc9bc61be#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.629 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Instance 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.630 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Instance b00f20b4-40d9-4fe7-8782-20859f161134 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.630 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Migration 1b763f3f-6a93-4f21-a96f-734cc9bc61be is active on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1640#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.631 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 3 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.631 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=896MB phys_disk=79GB used_disk=3GB total_vcpus=8 used_vcpus=3 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.763 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.767 2 DEBUG oslo_concurrency.processutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] CMD "scp -r /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8_resize/disk 192.168.122.101:/var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk" returned: 0 in 0.731s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.769 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Copying file /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.770 2 DEBUG oslo_concurrency.processutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8_resize/disk.config 192.168.122.101:/var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.796 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.839 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:11:17 np0005474864 nova_compute[192593]: 2025-10-07 20:11:17.840 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.057 2 DEBUG oslo_concurrency.processutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] CMD "scp -C -r /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8_resize/disk.config 192.168.122.101:/var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.config" returned: 0 in 0.287s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.058 2 DEBUG nova.virt.libvirt.volume.remotefs [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Copying file /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.059 2 DEBUG oslo_concurrency.processutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8_resize/disk.info 192.168.122.101:/var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.320 2 DEBUG oslo_concurrency.processutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] CMD "scp -C -r /var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8_resize/disk.info 192.168.122.101:/var/lib/nova/instances/31cd065b-2fe3-418f-869b-a5ac7f4405f8/disk.info" returned: 0 in 0.261s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.549 2 DEBUG neutronclient.v2_0.client [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 9283f59d-4eb5-4e2a-876d-b078582f6dec for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.581 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Acquiring lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.582 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.626 2 DEBUG nova.network.neutron [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Successfully created port: 18c97918-cded-43d4-9ccc-3569cc948551 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.656 2 DEBUG nova.compute.manager [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.698 2 DEBUG oslo_concurrency.lockutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.699 2 DEBUG oslo_concurrency.lockutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.717 2 INFO nova.compute.rpcapi [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.718 2 DEBUG oslo_concurrency.lockutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.732 2 DEBUG oslo_concurrency.lockutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Acquiring lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.732 2 DEBUG oslo_concurrency.lockutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.733 2 DEBUG oslo_concurrency.lockutils [None req-a24c3faf-b2a8-4ee5-9b65-7309be8fa493 88161d5a45864c12bc57301ea1fd590d 569aea2af32c48a49558145e66e4fc9f - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.747 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.747 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.753 2 DEBUG nova.virt.hardware [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.753 2 INFO nova.compute.claims [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Claim successful on node compute-2.ctlplane.example.com
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.838 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.968 2 DEBUG nova.compute.provider_tree [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  7 16:11:18 np0005474864 nova_compute[192593]: 2025-10-07 20:11:18.992 2 DEBUG nova.scheduler.client.report [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.030 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.283s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.032 2 DEBUG nova.compute.manager [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.081 2 DEBUG nova.compute.manager [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.082 2 DEBUG nova.network.neutron [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.087 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.107 2 INFO nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.128 2 DEBUG nova.compute.manager [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.227 2 DEBUG nova.compute.manager [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.229 2 DEBUG nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.230 2 INFO nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Creating image(s)
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.231 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Acquiring lock "/var/lib/nova/instances/905ba276-3439-4ffc-9fa7-b8ce71d79b96/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.232 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lock "/var/lib/nova/instances/905ba276-3439-4ffc-9fa7-b8ce71d79b96/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.234 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lock "/var/lib/nova/instances/905ba276-3439-4ffc-9fa7-b8ce71d79b96/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.259 2 DEBUG oslo_concurrency.processutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.332 2 DEBUG oslo_concurrency.processutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.334 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.336 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.358 2 DEBUG oslo_concurrency.processutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.437 2 DEBUG oslo_concurrency.processutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.438 2 DEBUG oslo_concurrency.processutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/905ba276-3439-4ffc-9fa7-b8ce71d79b96/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.755 2 DEBUG nova.policy [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'c01210eeec574b6e98f04c03e858c140', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'eb6dd99c8537434e826f505f7c17fb9d', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.836 2 DEBUG oslo_concurrency.processutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/905ba276-3439-4ffc-9fa7-b8ce71d79b96/disk 1073741824" returned: 0 in 0.398s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.837 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.502s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.838 2 DEBUG oslo_concurrency.processutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.941 2 DEBUG oslo_concurrency.processutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.103s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.942 2 DEBUG nova.virt.disk.api [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Checking if we can resize image /var/lib/nova/instances/905ba276-3439-4ffc-9fa7-b8ce71d79b96/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  7 16:11:19 np0005474864 nova_compute[192593]: 2025-10-07 20:11:19.943 2 DEBUG oslo_concurrency.processutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/905ba276-3439-4ffc-9fa7-b8ce71d79b96/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:11:20 np0005474864 nova_compute[192593]: 2025-10-07 20:11:20.009 2 DEBUG oslo_concurrency.processutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/905ba276-3439-4ffc-9fa7-b8ce71d79b96/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:11:20 np0005474864 nova_compute[192593]: 2025-10-07 20:11:20.011 2 DEBUG nova.virt.disk.api [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Cannot resize image /var/lib/nova/instances/905ba276-3439-4ffc-9fa7-b8ce71d79b96/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  7 16:11:20 np0005474864 nova_compute[192593]: 2025-10-07 20:11:20.012 2 DEBUG nova.objects.instance [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lazy-loading 'migration_context' on Instance uuid 905ba276-3439-4ffc-9fa7-b8ce71d79b96 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  7 16:11:20 np0005474864 nova_compute[192593]: 2025-10-07 20:11:20.034 2 DEBUG nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  7 16:11:20 np0005474864 nova_compute[192593]: 2025-10-07 20:11:20.035 2 DEBUG nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Ensure instance console log exists: /var/lib/nova/instances/905ba276-3439-4ffc-9fa7-b8ce71d79b96/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  7 16:11:20 np0005474864 nova_compute[192593]: 2025-10-07 20:11:20.036 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:11:20 np0005474864 nova_compute[192593]: 2025-10-07 20:11:20.036 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:11:20 np0005474864 nova_compute[192593]: 2025-10-07 20:11:20.037 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:11:20 np0005474864 nova_compute[192593]: 2025-10-07 20:11:20.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:11:21 np0005474864 nova_compute[192593]: 2025-10-07 20:11:21.089 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:11:21 np0005474864 nova_compute[192593]: 2025-10-07 20:11:21.127 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:11:21 np0005474864 nova_compute[192593]: 2025-10-07 20:11:21.411 2 DEBUG nova.network.neutron [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Successfully updated port: 18c97918-cded-43d4-9ccc-3569cc948551 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  7 16:11:21 np0005474864 nova_compute[192593]: 2025-10-07 20:11:21.434 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "refresh_cache-b00f20b4-40d9-4fe7-8782-20859f161134" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  7 16:11:21 np0005474864 nova_compute[192593]: 2025-10-07 20:11:21.435 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquired lock "refresh_cache-b00f20b4-40d9-4fe7-8782-20859f161134" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  7 16:11:21 np0005474864 nova_compute[192593]: 2025-10-07 20:11:21.435 2 DEBUG nova.network.neutron [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  7 16:11:21 np0005474864 nova_compute[192593]: 2025-10-07 20:11:21.644 2 DEBUG nova.compute.manager [req-b4ef438b-a4b4-4c0b-a6a8-c813d87e1e5b req-51717813-608a-455a-af74-b647a4345f3f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Received event network-vif-unplugged-9283f59d-4eb5-4e2a-876d-b078582f6dec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  7 16:11:21 np0005474864 nova_compute[192593]: 2025-10-07 20:11:21.645 2 DEBUG oslo_concurrency.lockutils [req-b4ef438b-a4b4-4c0b-a6a8-c813d87e1e5b req-51717813-608a-455a-af74-b647a4345f3f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:11:21 np0005474864 nova_compute[192593]: 2025-10-07 20:11:21.645 2 DEBUG oslo_concurrency.lockutils [req-b4ef438b-a4b4-4c0b-a6a8-c813d87e1e5b req-51717813-608a-455a-af74-b647a4345f3f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:11:21 np0005474864 nova_compute[192593]: 2025-10-07 20:11:21.646 2 DEBUG oslo_concurrency.lockutils [req-b4ef438b-a4b4-4c0b-a6a8-c813d87e1e5b req-51717813-608a-455a-af74-b647a4345f3f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:11:21 np0005474864 nova_compute[192593]: 2025-10-07 20:11:21.646 2 DEBUG nova.compute.manager [req-b4ef438b-a4b4-4c0b-a6a8-c813d87e1e5b req-51717813-608a-455a-af74-b647a4345f3f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] No waiting events found dispatching network-vif-unplugged-9283f59d-4eb5-4e2a-876d-b078582f6dec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  7 16:11:21 np0005474864 nova_compute[192593]: 2025-10-07 20:11:21.647 2 WARNING nova.compute.manager [req-b4ef438b-a4b4-4c0b-a6a8-c813d87e1e5b req-51717813-608a-455a-af74-b647a4345f3f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Received unexpected event network-vif-unplugged-9283f59d-4eb5-4e2a-876d-b078582f6dec for instance with vm_state active and task_state resize_migrated.
Oct  7 16:11:21 np0005474864 nova_compute[192593]: 2025-10-07 20:11:21.755 2 DEBUG nova.network.neutron [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  7 16:11:21 np0005474864 nova_compute[192593]: 2025-10-07 20:11:21.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:11:21 np0005474864 nova_compute[192593]: 2025-10-07 20:11:21.933 2 DEBUG nova.network.neutron [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Successfully created port: 2f616ccf-0f40-4768-8b90-07dae3707b82 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  7 16:11:22 np0005474864 nova_compute[192593]: 2025-10-07 20:11:22.091 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:11:22 np0005474864 nova_compute[192593]: 2025-10-07 20:11:22.092 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  7 16:11:22 np0005474864 nova_compute[192593]: 2025-10-07 20:11:22.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  7 16:11:22 np0005474864 nova_compute[192593]: 2025-10-07 20:11:22.113 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Oct  7 16:11:22 np0005474864 nova_compute[192593]: 2025-10-07 20:11:22.114 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Oct  7 16:11:22 np0005474864 nova_compute[192593]: 2025-10-07 20:11:22.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.160 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "refresh_cache-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.160 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquired lock "refresh_cache-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.160 2 DEBUG nova.network.neutron [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.160 2 DEBUG nova.objects.instance [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.923 2 DEBUG nova.network.neutron [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Updating instance_info_cache with network_info: [{"id": "18c97918-cded-43d4-9ccc-3569cc948551", "address": "fa:16:3e:d6:c9:ed", "network": {"id": "116d80b3-97b2-4699-8d30-7d57fd9728fe", "bridge": "br-int", "label": "tempest-network-smoke--421700815", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18c97918-cd", "ovs_interfaceid": "18c97918-cded-43d4-9ccc-3569cc948551", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.953 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Releasing lock "refresh_cache-b00f20b4-40d9-4fe7-8782-20859f161134" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.953 2 DEBUG nova.compute.manager [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Instance network_info: |[{"id": "18c97918-cded-43d4-9ccc-3569cc948551", "address": "fa:16:3e:d6:c9:ed", "network": {"id": "116d80b3-97b2-4699-8d30-7d57fd9728fe", "bridge": "br-int", "label": "tempest-network-smoke--421700815", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18c97918-cd", "ovs_interfaceid": "18c97918-cded-43d4-9ccc-3569cc948551", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.956 2 DEBUG nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Start _get_guest_xml network_info=[{"id": "18c97918-cded-43d4-9ccc-3569cc948551", "address": "fa:16:3e:d6:c9:ed", "network": {"id": "116d80b3-97b2-4699-8d30-7d57fd9728fe", "bridge": "br-int", "label": "tempest-network-smoke--421700815", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18c97918-cd", "ovs_interfaceid": "18c97918-cded-43d4-9ccc-3569cc948551", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.964 2 WARNING nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.969 2 DEBUG nova.virt.libvirt.host [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.970 2 DEBUG nova.virt.libvirt.host [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.973 2 DEBUG nova.virt.libvirt.host [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.973 2 DEBUG nova.virt.libvirt.host [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.975 2 DEBUG nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.975 2 DEBUG nova.virt.hardware [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T20:09:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3fec056a-1226-48ad-a02c-e4fe097a9363',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.976 2 DEBUG nova.virt.hardware [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.976 2 DEBUG nova.virt.hardware [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.977 2 DEBUG nova.virt.hardware [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.977 2 DEBUG nova.virt.hardware [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.977 2 DEBUG nova.virt.hardware [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.977 2 DEBUG nova.virt.hardware [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.978 2 DEBUG nova.virt.hardware [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.978 2 DEBUG nova.virt.hardware [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.978 2 DEBUG nova.virt.hardware [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.978 2 DEBUG nova.virt.hardware [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.983 2 DEBUG nova.virt.libvirt.vif [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:11:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-36228192',display_name='tempest-TestNetworkBasicOps-server-36228192',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-36228192',id=11,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBETfy9GNnTELxCLuS0VSQmy+oeDAuygUBwGUg5UFOaSrJAPx2qJYkaTHuzeoQ/vg5ArfEzqe69+o7Q9PtFBcHv7rv9msY/iszJVgKdNyHhqs7VTb3gngtn9e2I03GN5ceQ==',key_name='tempest-TestNetworkBasicOps-568178070',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-2kz6cpbc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:11:14Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=b00f20b4-40d9-4fe7-8782-20859f161134,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "18c97918-cded-43d4-9ccc-3569cc948551", "address": "fa:16:3e:d6:c9:ed", "network": {"id": "116d80b3-97b2-4699-8d30-7d57fd9728fe", "bridge": "br-int", "label": "tempest-network-smoke--421700815", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18c97918-cd", "ovs_interfaceid": "18c97918-cded-43d4-9ccc-3569cc948551", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.984 2 DEBUG nova.network.os_vif_util [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converting VIF {"id": "18c97918-cded-43d4-9ccc-3569cc948551", "address": "fa:16:3e:d6:c9:ed", "network": {"id": "116d80b3-97b2-4699-8d30-7d57fd9728fe", "bridge": "br-int", "label": "tempest-network-smoke--421700815", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18c97918-cd", "ovs_interfaceid": "18c97918-cded-43d4-9ccc-3569cc948551", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.985 2 DEBUG nova.network.os_vif_util [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:c9:ed,bridge_name='br-int',has_traffic_filtering=True,id=18c97918-cded-43d4-9ccc-3569cc948551,network=Network(116d80b3-97b2-4699-8d30-7d57fd9728fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18c97918-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:11:23 np0005474864 nova_compute[192593]: 2025-10-07 20:11:23.986 2 DEBUG nova.objects.instance [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lazy-loading 'pci_devices' on Instance uuid b00f20b4-40d9-4fe7-8782-20859f161134 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.004 2 DEBUG nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] End _get_guest_xml xml=<domain type="kvm">
Oct  7 16:11:24 np0005474864 nova_compute[192593]:  <uuid>b00f20b4-40d9-4fe7-8782-20859f161134</uuid>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:  <name>instance-0000000b</name>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:  <memory>131072</memory>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:  <vcpu>1</vcpu>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <nova:name>tempest-TestNetworkBasicOps-server-36228192</nova:name>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <nova:creationTime>2025-10-07 20:11:23</nova:creationTime>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <nova:flavor name="m1.nano">
Oct  7 16:11:24 np0005474864 nova_compute[192593]:        <nova:memory>128</nova:memory>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:        <nova:disk>1</nova:disk>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:        <nova:swap>0</nova:swap>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:        <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:        <nova:vcpus>1</nova:vcpus>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      </nova:flavor>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <nova:owner>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:        <nova:user uuid="fde8db13cdde4728903e9d2749f853e1">tempest-TestNetworkBasicOps-666319938-project-member</nova:user>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:        <nova:project uuid="57491b24c6b2419c842483a87c8b4d42">tempest-TestNetworkBasicOps-666319938</nova:project>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      </nova:owner>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <nova:ports>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:        <nova:port uuid="18c97918-cded-43d4-9ccc-3569cc948551">
Oct  7 16:11:24 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="10.100.0.27" ipVersion="4"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      </nova:ports>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    </nova:instance>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:  <sysinfo type="smbios">
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <entry name="manufacturer">RDO</entry>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <entry name="product">OpenStack Compute</entry>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <entry name="serial">b00f20b4-40d9-4fe7-8782-20859f161134</entry>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <entry name="uuid">b00f20b4-40d9-4fe7-8782-20859f161134</entry>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <entry name="family">Virtual Machine</entry>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <boot dev="hd"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <smbios mode="sysinfo"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <vmcoreinfo/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:  <clock offset="utc">
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <timer name="pit" tickpolicy="delay"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <timer name="hpet" present="no"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:  <cpu mode="custom" match="exact">
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <model>Nehalem</model>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <topology sockets="1" cores="1" threads="1"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <disk type="file" device="disk">
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/b00f20b4-40d9-4fe7-8782-20859f161134/disk"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <target dev="vda" bus="virtio"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <disk type="file" device="cdrom">
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <driver name="qemu" type="raw" cache="none"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/b00f20b4-40d9-4fe7-8782-20859f161134/disk.config"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <target dev="sda" bus="sata"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:d6:c9:ed"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <target dev="tap18c97918-cd"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <serial type="pty">
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <log file="/var/lib/nova/instances/b00f20b4-40d9-4fe7-8782-20859f161134/console.log" append="off"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <input type="tablet" bus="usb"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <rng model="virtio">
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <backend model="random">/dev/urandom</backend>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <controller type="usb" index="0"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    <memballoon model="virtio">
Oct  7 16:11:24 np0005474864 nova_compute[192593]:      <stats period="10"/>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:11:24 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:11:24 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:11:24 np0005474864 nova_compute[192593]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.006 2 DEBUG nova.compute.manager [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Preparing to wait for external event network-vif-plugged-18c97918-cded-43d4-9ccc-3569cc948551 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.007 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "b00f20b4-40d9-4fe7-8782-20859f161134-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.008 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "b00f20b4-40d9-4fe7-8782-20859f161134-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.008 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "b00f20b4-40d9-4fe7-8782-20859f161134-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.010 2 DEBUG nova.virt.libvirt.vif [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:11:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-36228192',display_name='tempest-TestNetworkBasicOps-server-36228192',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-36228192',id=11,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBETfy9GNnTELxCLuS0VSQmy+oeDAuygUBwGUg5UFOaSrJAPx2qJYkaTHuzeoQ/vg5ArfEzqe69+o7Q9PtFBcHv7rv9msY/iszJVgKdNyHhqs7VTb3gngtn9e2I03GN5ceQ==',key_name='tempest-TestNetworkBasicOps-568178070',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-2kz6cpbc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:11:14Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=b00f20b4-40d9-4fe7-8782-20859f161134,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "18c97918-cded-43d4-9ccc-3569cc948551", "address": "fa:16:3e:d6:c9:ed", "network": {"id": "116d80b3-97b2-4699-8d30-7d57fd9728fe", "bridge": "br-int", "label": "tempest-network-smoke--421700815", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18c97918-cd", "ovs_interfaceid": "18c97918-cded-43d4-9ccc-3569cc948551", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.011 2 DEBUG nova.network.os_vif_util [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converting VIF {"id": "18c97918-cded-43d4-9ccc-3569cc948551", "address": "fa:16:3e:d6:c9:ed", "network": {"id": "116d80b3-97b2-4699-8d30-7d57fd9728fe", "bridge": "br-int", "label": "tempest-network-smoke--421700815", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18c97918-cd", "ovs_interfaceid": "18c97918-cded-43d4-9ccc-3569cc948551", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.013 2 DEBUG nova.network.os_vif_util [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:c9:ed,bridge_name='br-int',has_traffic_filtering=True,id=18c97918-cded-43d4-9ccc-3569cc948551,network=Network(116d80b3-97b2-4699-8d30-7d57fd9728fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18c97918-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.014 2 DEBUG os_vif [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:c9:ed,bridge_name='br-int',has_traffic_filtering=True,id=18c97918-cded-43d4-9ccc-3569cc948551,network=Network(116d80b3-97b2-4699-8d30-7d57fd9728fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18c97918-cd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.016 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.017 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.022 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap18c97918-cd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.023 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap18c97918-cd, col_values=(('external_ids', {'iface-id': '18c97918-cded-43d4-9ccc-3569cc948551', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d6:c9:ed', 'vm-uuid': 'b00f20b4-40d9-4fe7-8782-20859f161134'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:24 np0005474864 NetworkManager[51631]: <info>  [1759867884.0265] manager: (tap18c97918-cd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/35)
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.036 2 INFO os_vif [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:c9:ed,bridge_name='br-int',has_traffic_filtering=True,id=18c97918-cded-43d4-9ccc-3569cc948551,network=Network(116d80b3-97b2-4699-8d30-7d57fd9728fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18c97918-cd')#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.132 2 DEBUG nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.132 2 DEBUG nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.133 2 DEBUG nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] No VIF found with MAC fa:16:3e:d6:c9:ed, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:11:24 np0005474864 nova_compute[192593]: 2025-10-07 20:11:24.133 2 INFO nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Using config drive#033[00m
Oct  7 16:11:24 np0005474864 podman[221157]: 2025-10-07 20:11:24.411058457 +0000 UTC m=+0.096191707 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  7 16:11:24 np0005474864 podman[221158]: 2025-10-07 20:11:24.445101626 +0000 UTC m=+0.126318054 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., config_id=edpm)
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.085 2 DEBUG nova.network.neutron [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Successfully updated port: 2f616ccf-0f40-4768-8b90-07dae3707b82 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.117 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Acquiring lock "refresh_cache-905ba276-3439-4ffc-9fa7-b8ce71d79b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.118 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Acquired lock "refresh_cache-905ba276-3439-4ffc-9fa7-b8ce71d79b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.119 2 DEBUG nova.network.neutron [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.139 2 INFO nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Creating config drive at /var/lib/nova/instances/b00f20b4-40d9-4fe7-8782-20859f161134/disk.config#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.144 2 DEBUG oslo_concurrency.processutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b00f20b4-40d9-4fe7-8782-20859f161134/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwpox8bkl execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.283 2 DEBUG nova.compute.manager [req-3fc04994-529a-4c59-b0aa-66dfc999359e req-e4ca77df-6a4f-486b-8507-222661a43348 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Received event network-vif-plugged-9283f59d-4eb5-4e2a-876d-b078582f6dec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.284 2 DEBUG oslo_concurrency.lockutils [req-3fc04994-529a-4c59-b0aa-66dfc999359e req-e4ca77df-6a4f-486b-8507-222661a43348 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.285 2 DEBUG oslo_concurrency.lockutils [req-3fc04994-529a-4c59-b0aa-66dfc999359e req-e4ca77df-6a4f-486b-8507-222661a43348 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.286 2 DEBUG oslo_concurrency.lockutils [req-3fc04994-529a-4c59-b0aa-66dfc999359e req-e4ca77df-6a4f-486b-8507-222661a43348 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.287 2 DEBUG nova.compute.manager [req-3fc04994-529a-4c59-b0aa-66dfc999359e req-e4ca77df-6a4f-486b-8507-222661a43348 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] No waiting events found dispatching network-vif-plugged-9283f59d-4eb5-4e2a-876d-b078582f6dec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.287 2 WARNING nova.compute.manager [req-3fc04994-529a-4c59-b0aa-66dfc999359e req-e4ca77df-6a4f-486b-8507-222661a43348 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Received unexpected event network-vif-plugged-9283f59d-4eb5-4e2a-876d-b078582f6dec for instance with vm_state active and task_state resize_finish.#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.288 2 DEBUG nova.compute.manager [req-3fc04994-529a-4c59-b0aa-66dfc999359e req-e4ca77df-6a4f-486b-8507-222661a43348 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Received event network-changed-18c97918-cded-43d4-9ccc-3569cc948551 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.288 2 DEBUG nova.compute.manager [req-3fc04994-529a-4c59-b0aa-66dfc999359e req-e4ca77df-6a4f-486b-8507-222661a43348 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Refreshing instance network info cache due to event network-changed-18c97918-cded-43d4-9ccc-3569cc948551. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.289 2 DEBUG oslo_concurrency.lockutils [req-3fc04994-529a-4c59-b0aa-66dfc999359e req-e4ca77df-6a4f-486b-8507-222661a43348 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-b00f20b4-40d9-4fe7-8782-20859f161134" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.289 2 DEBUG oslo_concurrency.lockutils [req-3fc04994-529a-4c59-b0aa-66dfc999359e req-e4ca77df-6a4f-486b-8507-222661a43348 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-b00f20b4-40d9-4fe7-8782-20859f161134" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.289 2 DEBUG nova.network.neutron [req-3fc04994-529a-4c59-b0aa-66dfc999359e req-e4ca77df-6a4f-486b-8507-222661a43348 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Refreshing network info cache for port 18c97918-cded-43d4-9ccc-3569cc948551 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.292 2 DEBUG oslo_concurrency.processutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b00f20b4-40d9-4fe7-8782-20859f161134/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwpox8bkl" returned: 0 in 0.148s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.326 2 DEBUG nova.compute.manager [req-3fe862f0-0cb7-4f26-84e9-b4f1f8c91694 req-21ef8bce-7efc-4d65-a3bd-44bb6e55e962 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Received event network-changed-9283f59d-4eb5-4e2a-876d-b078582f6dec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.327 2 DEBUG nova.compute.manager [req-3fe862f0-0cb7-4f26-84e9-b4f1f8c91694 req-21ef8bce-7efc-4d65-a3bd-44bb6e55e962 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Refreshing instance network info cache due to event network-changed-9283f59d-4eb5-4e2a-876d-b078582f6dec. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.328 2 DEBUG oslo_concurrency.lockutils [req-3fe862f0-0cb7-4f26-84e9-b4f1f8c91694 req-21ef8bce-7efc-4d65-a3bd-44bb6e55e962 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-31cd065b-2fe3-418f-869b-a5ac7f4405f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.328 2 DEBUG oslo_concurrency.lockutils [req-3fe862f0-0cb7-4f26-84e9-b4f1f8c91694 req-21ef8bce-7efc-4d65-a3bd-44bb6e55e962 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-31cd065b-2fe3-418f-869b-a5ac7f4405f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.329 2 DEBUG nova.network.neutron [req-3fe862f0-0cb7-4f26-84e9-b4f1f8c91694 req-21ef8bce-7efc-4d65-a3bd-44bb6e55e962 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Refreshing network info cache for port 9283f59d-4eb5-4e2a-876d-b078582f6dec _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:11:25 np0005474864 NetworkManager[51631]: <info>  [1759867885.3550] manager: (tap18c97918-cd): new Tun device (/org/freedesktop/NetworkManager/Devices/36)
Oct  7 16:11:25 np0005474864 kernel: tap18c97918-cd: entered promiscuous mode
Oct  7 16:11:25 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:25Z|00044|binding|INFO|Claiming lport 18c97918-cded-43d4-9ccc-3569cc948551 for this chassis.
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:25 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:25Z|00045|binding|INFO|18c97918-cded-43d4-9ccc-3569cc948551: Claiming fa:16:3e:d6:c9:ed 10.100.0.27
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.363 2 DEBUG nova.network.neutron [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.377 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:c9:ed 10.100.0.27'], port_security=['fa:16:3e:d6:c9:ed 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': 'b00f20b4-40d9-4fe7-8782-20859f161134', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-116d80b3-97b2-4699-8d30-7d57fd9728fe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '57491b24c6b2419c842483a87c8b4d42', 'neutron:revision_number': '2', 'neutron:security_group_ids': '18e2fd14-c88a-4a66-8665-cb12f02155ff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1aac10a9-f2b0-4902-ab46-4b1f397c010b, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=18c97918-cded-43d4-9ccc-3569cc948551) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.378 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 18c97918-cded-43d4-9ccc-3569cc948551 in datapath 116d80b3-97b2-4699-8d30-7d57fd9728fe bound to our chassis#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.381 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 116d80b3-97b2-4699-8d30-7d57fd9728fe#033[00m
Oct  7 16:11:25 np0005474864 systemd-udevd[221223]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:11:25 np0005474864 NetworkManager[51631]: <info>  [1759867885.3974] device (tap18c97918-cd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:11:25 np0005474864 NetworkManager[51631]: <info>  [1759867885.4003] device (tap18c97918-cd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.400 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[a67900dd-7053-4ab6-a0a8-478a638c30be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.401 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap116d80b3-91 in ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.403 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap116d80b3-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.403 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[55c0aef3-2231-46d9-853b-bf67f69ee528]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.404 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[41cd9be2-ec85-4164-9292-e8551b97593f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:25 np0005474864 systemd-machined[152586]: New machine qemu-3-instance-0000000b.
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.417 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[7f00651b-eb63-426e-91b9-c5dd822ee54f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:25 np0005474864 systemd[1]: Started Virtual Machine qemu-3-instance-0000000b.
Oct  7 16:11:25 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:25Z|00046|binding|INFO|Setting lport 18c97918-cded-43d4-9ccc-3569cc948551 ovn-installed in OVS
Oct  7 16:11:25 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:25Z|00047|binding|INFO|Setting lport 18c97918-cded-43d4-9ccc-3569cc948551 up in Southbound
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.439 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c8a66143-5f26-4d24-95ce-3d0d3e62ff9e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.467 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[43599f61-7b06-446f-947a-0ede82aad9a7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:25 np0005474864 NetworkManager[51631]: <info>  [1759867885.4737] manager: (tap116d80b3-90): new Veth device (/org/freedesktop/NetworkManager/Devices/37)
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.472 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[e0256c17-3e3b-47a2-a4dd-d4b4323fc3cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.522 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[d721cb01-34e9-4b26-8838-ebb96ce4fc68]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.527 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[0acbb598-041c-46ee-bebc-2d82444dc47c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:25 np0005474864 NetworkManager[51631]: <info>  [1759867885.5647] device (tap116d80b3-90): carrier: link connected
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.572 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[213ae79d-1f67-4082-a2c2-1932f3bb65ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.573 2 DEBUG nova.network.neutron [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Updating instance_info_cache with network_info: [{"id": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "address": "fa:16:3e:5d:0b:e9", "network": {"id": "a9053617-1148-4139-a949-8321e760481f", "bridge": "br-int", "label": "tempest-network-smoke--1784539601", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac9cc3d-50", "ovs_interfaceid": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.592 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Releasing lock "refresh_cache-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.592 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.593 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.593 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.594 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.605 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[dd5149d4-5c8b-4fae-871f-4f06034b3577]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap116d80b3-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:78:01:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 352145, 'reachable_time': 19286, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221258, 'error': None, 'target': 'ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.637 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[6bbe138a-e11e-4439-9b27-e7ce5ec66216]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe78:11e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 352145, 'tstamp': 352145}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221259, 'error': None, 'target': 'ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.669 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[a836985f-f6d0-4a77-8983-f1ece7f74917]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap116d80b3-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:78:01:1e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 19], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 352145, 'reachable_time': 19286, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221260, 'error': None, 'target': 'ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.713 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[fbc94f05-d999-49e7-8b9a-f55c4f41a1b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.798 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[5db9c09a-6a4d-4769-99b3-64504f57dbca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.800 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap116d80b3-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.801 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.802 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap116d80b3-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:25 np0005474864 NetworkManager[51631]: <info>  [1759867885.8066] manager: (tap116d80b3-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/38)
Oct  7 16:11:25 np0005474864 kernel: tap116d80b3-90: entered promiscuous mode
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.811 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap116d80b3-90, col_values=(('external_ids', {'iface-id': '3ef78041-272c-48f8-bb33-3cc1fdd4d058'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:11:25 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:25Z|00048|binding|INFO|Releasing lport 3ef78041-272c-48f8-bb33-3cc1fdd4d058 from this chassis (sb_readonly=0)
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:25 np0005474864 nova_compute[192593]: 2025-10-07 20:11:25.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.839 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/116d80b3-97b2-4699-8d30-7d57fd9728fe.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/116d80b3-97b2-4699-8d30-7d57fd9728fe.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.840 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[9b8777d1-b986-43d4-acec-66774cf4dc97]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.841 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-116d80b3-97b2-4699-8d30-7d57fd9728fe
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/116d80b3-97b2-4699-8d30-7d57fd9728fe.pid.haproxy
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID 116d80b3-97b2-4699-8d30-7d57fd9728fe
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:11:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:25.842 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe', 'env', 'PROCESS_TAG=haproxy-116d80b3-97b2-4699-8d30-7d57fd9728fe', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/116d80b3-97b2-4699-8d30-7d57fd9728fe.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:11:26 np0005474864 podman[221299]: 2025-10-07 20:11:26.251054842 +0000 UTC m=+0.054364904 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:11:26 np0005474864 nova_compute[192593]: 2025-10-07 20:11:26.394 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759867886.393508, b00f20b4-40d9-4fe7-8782-20859f161134 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:11:26 np0005474864 nova_compute[192593]: 2025-10-07 20:11:26.395 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] VM Started (Lifecycle Event)#033[00m
Oct  7 16:11:26 np0005474864 nova_compute[192593]: 2025-10-07 20:11:26.426 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:11:26 np0005474864 nova_compute[192593]: 2025-10-07 20:11:26.432 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759867886.3937526, b00f20b4-40d9-4fe7-8782-20859f161134 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:11:26 np0005474864 nova_compute[192593]: 2025-10-07 20:11:26.433 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:11:26 np0005474864 nova_compute[192593]: 2025-10-07 20:11:26.452 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:11:26 np0005474864 nova_compute[192593]: 2025-10-07 20:11:26.460 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:11:26 np0005474864 nova_compute[192593]: 2025-10-07 20:11:26.482 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:11:26 np0005474864 podman[221299]: 2025-10-07 20:11:26.659529289 +0000 UTC m=+0.462839261 container create 86e75a47dc64fcd70cb91696a35e3d2db616fb42a44a1fda1aa919bb6cc2c7d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:11:26 np0005474864 systemd[1]: Started libpod-conmon-86e75a47dc64fcd70cb91696a35e3d2db616fb42a44a1fda1aa919bb6cc2c7d7.scope.
Oct  7 16:11:26 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:11:26 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/9909fdef655b4d84b97b756481f3c60130c0e5943d49ac4883d2e39fd83b72e7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:11:27 np0005474864 podman[221299]: 2025-10-07 20:11:27.023993381 +0000 UTC m=+0.827303423 container init 86e75a47dc64fcd70cb91696a35e3d2db616fb42a44a1fda1aa919bb6cc2c7d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:11:27 np0005474864 podman[221299]: 2025-10-07 20:11:27.031134537 +0000 UTC m=+0.834444529 container start 86e75a47dc64fcd70cb91696a35e3d2db616fb42a44a1fda1aa919bb6cc2c7d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:11:27 np0005474864 neutron-haproxy-ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe[221313]: [NOTICE]   (221317) : New worker (221319) forked
Oct  7 16:11:27 np0005474864 neutron-haproxy-ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe[221313]: [NOTICE]   (221317) : Loading success.
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.372 2 DEBUG nova.network.neutron [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Updating instance_info_cache with network_info: [{"id": "2f616ccf-0f40-4768-8b90-07dae3707b82", "address": "fa:16:3e:d6:ac:0d", "network": {"id": "872b0f73-e6f1-41ef-b96e-e40b61240904", "bridge": "br-int", "label": "tempest-network-smoke--469697660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb6dd99c8537434e826f505f7c17fb9d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f616ccf-0f", "ovs_interfaceid": "2f616ccf-0f40-4768-8b90-07dae3707b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.393 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Releasing lock "refresh_cache-905ba276-3439-4ffc-9fa7-b8ce71d79b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.393 2 DEBUG nova.compute.manager [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Instance network_info: |[{"id": "2f616ccf-0f40-4768-8b90-07dae3707b82", "address": "fa:16:3e:d6:ac:0d", "network": {"id": "872b0f73-e6f1-41ef-b96e-e40b61240904", "bridge": "br-int", "label": "tempest-network-smoke--469697660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb6dd99c8537434e826f505f7c17fb9d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f616ccf-0f", "ovs_interfaceid": "2f616ccf-0f40-4768-8b90-07dae3707b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.398 2 DEBUG nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Start _get_guest_xml network_info=[{"id": "2f616ccf-0f40-4768-8b90-07dae3707b82", "address": "fa:16:3e:d6:ac:0d", "network": {"id": "872b0f73-e6f1-41ef-b96e-e40b61240904", "bridge": "br-int", "label": "tempest-network-smoke--469697660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb6dd99c8537434e826f505f7c17fb9d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f616ccf-0f", "ovs_interfaceid": "2f616ccf-0f40-4768-8b90-07dae3707b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.404 2 WARNING nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.411 2 DEBUG nova.virt.libvirt.host [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.412 2 DEBUG nova.virt.libvirt.host [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.417 2 DEBUG nova.virt.libvirt.host [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.418 2 DEBUG nova.virt.libvirt.host [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.420 2 DEBUG nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.421 2 DEBUG nova.virt.hardware [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T20:09:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3fec056a-1226-48ad-a02c-e4fe097a9363',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.423 2 DEBUG nova.virt.hardware [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.424 2 DEBUG nova.virt.hardware [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.424 2 DEBUG nova.virt.hardware [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.424 2 DEBUG nova.virt.hardware [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.425 2 DEBUG nova.virt.hardware [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.425 2 DEBUG nova.virt.hardware [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.426 2 DEBUG nova.virt.hardware [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.426 2 DEBUG nova.virt.hardware [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.427 2 DEBUG nova.virt.hardware [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.427 2 DEBUG nova.virt.hardware [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.435 2 DEBUG nova.virt.libvirt.vif [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:11:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1096783716-access_point-474518308',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1096783716-access_point-474518308',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1096783716-ac',id=12,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGDkqxmMN+Aruikf/t9ZJtlDoMcoyfxz6sKwIlxv989FdnDhDVp3Dd9igEHauIxEyymp1djFrT/aIF60PZHGcUp63X92MHlpEGvJzVqfsnrqYyF8hZ+ZfmxbiJh7Y+ZIOQ==',key_name='tempest-TestSecurityGroupsBasicOps-1494070395',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eb6dd99c8537434e826f505f7c17fb9d',ramdisk_id='',reservation_id='r-8x4964bw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1096783716',owner_user_name='tempest-TestSecurityGroupsBasicOps-1096783716-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:11:19Z,user_data=None,user_id='c01210eeec574b6e98f04c03e858c140',uuid=905ba276-3439-4ffc-9fa7-b8ce71d79b96,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2f616ccf-0f40-4768-8b90-07dae3707b82", "address": "fa:16:3e:d6:ac:0d", "network": {"id": "872b0f73-e6f1-41ef-b96e-e40b61240904", "bridge": "br-int", "label": "tempest-network-smoke--469697660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "eb6dd99c8537434e826f505f7c17fb9d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f616ccf-0f", "ovs_interfaceid": "2f616ccf-0f40-4768-8b90-07dae3707b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.435 2 DEBUG nova.network.os_vif_util [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Converting VIF {"id": "2f616ccf-0f40-4768-8b90-07dae3707b82", "address": "fa:16:3e:d6:ac:0d", "network": {"id": "872b0f73-e6f1-41ef-b96e-e40b61240904", "bridge": "br-int", "label": "tempest-network-smoke--469697660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb6dd99c8537434e826f505f7c17fb9d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f616ccf-0f", "ovs_interfaceid": "2f616ccf-0f40-4768-8b90-07dae3707b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.437 2 DEBUG nova.network.os_vif_util [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:ac:0d,bridge_name='br-int',has_traffic_filtering=True,id=2f616ccf-0f40-4768-8b90-07dae3707b82,network=Network(872b0f73-e6f1-41ef-b96e-e40b61240904),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f616ccf-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.439 2 DEBUG nova.objects.instance [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lazy-loading 'pci_devices' on Instance uuid 905ba276-3439-4ffc-9fa7-b8ce71d79b96 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.453 2 DEBUG nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] End _get_guest_xml xml=<domain type="kvm">
Oct  7 16:11:27 np0005474864 nova_compute[192593]:  <uuid>905ba276-3439-4ffc-9fa7-b8ce71d79b96</uuid>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:  <name>instance-0000000c</name>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:  <memory>131072</memory>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:  <vcpu>1</vcpu>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1096783716-access_point-474518308</nova:name>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <nova:creationTime>2025-10-07 20:11:27</nova:creationTime>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <nova:flavor name="m1.nano">
Oct  7 16:11:27 np0005474864 nova_compute[192593]:        <nova:memory>128</nova:memory>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:        <nova:disk>1</nova:disk>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:        <nova:swap>0</nova:swap>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:        <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:        <nova:vcpus>1</nova:vcpus>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      </nova:flavor>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <nova:owner>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:        <nova:user uuid="c01210eeec574b6e98f04c03e858c140">tempest-TestSecurityGroupsBasicOps-1096783716-project-member</nova:user>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:        <nova:project uuid="eb6dd99c8537434e826f505f7c17fb9d">tempest-TestSecurityGroupsBasicOps-1096783716</nova:project>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      </nova:owner>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <nova:ports>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:        <nova:port uuid="2f616ccf-0f40-4768-8b90-07dae3707b82">
Oct  7 16:11:27 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      </nova:ports>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    </nova:instance>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:  <sysinfo type="smbios">
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <entry name="manufacturer">RDO</entry>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <entry name="product">OpenStack Compute</entry>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <entry name="serial">905ba276-3439-4ffc-9fa7-b8ce71d79b96</entry>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <entry name="uuid">905ba276-3439-4ffc-9fa7-b8ce71d79b96</entry>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <entry name="family">Virtual Machine</entry>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <boot dev="hd"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <smbios mode="sysinfo"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <vmcoreinfo/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:  <clock offset="utc">
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <timer name="pit" tickpolicy="delay"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <timer name="hpet" present="no"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:  <cpu mode="custom" match="exact">
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <model>Nehalem</model>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <topology sockets="1" cores="1" threads="1"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <disk type="file" device="disk">
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/905ba276-3439-4ffc-9fa7-b8ce71d79b96/disk"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <target dev="vda" bus="virtio"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <disk type="file" device="cdrom">
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <driver name="qemu" type="raw" cache="none"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/905ba276-3439-4ffc-9fa7-b8ce71d79b96/disk.config"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <target dev="sda" bus="sata"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:d6:ac:0d"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <target dev="tap2f616ccf-0f"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <serial type="pty">
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <log file="/var/lib/nova/instances/905ba276-3439-4ffc-9fa7-b8ce71d79b96/console.log" append="off"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <input type="tablet" bus="usb"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <rng model="virtio">
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <backend model="random">/dev/urandom</backend>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <controller type="usb" index="0"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    <memballoon model="virtio">
Oct  7 16:11:27 np0005474864 nova_compute[192593]:      <stats period="10"/>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:11:27 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:11:27 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:11:27 np0005474864 nova_compute[192593]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.454 2 DEBUG nova.compute.manager [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Preparing to wait for external event network-vif-plugged-2f616ccf-0f40-4768-8b90-07dae3707b82 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.455 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Acquiring lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.455 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.455 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.456 2 DEBUG nova.virt.libvirt.vif [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:11:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1096783716-access_point-474518308',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1096783716-access_point-474518308',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1096783716-ac',id=12,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGDkqxmMN+Aruikf/t9ZJtlDoMcoyfxz6sKwIlxv989FdnDhDVp3Dd9igEHauIxEyymp1djFrT/aIF60PZHGcUp63X92MHlpEGvJzVqfsnrqYyF8hZ+ZfmxbiJh7Y+ZIOQ==',key_name='tempest-TestSecurityGroupsBasicOps-1494070395',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='eb6dd99c8537434e826f505f7c17fb9d',ramdisk_id='',reservation_id='r-8x4964bw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1096783716',owner_user_name='tempest-TestSecurityGroupsBasicOps-1096783716-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:11:19Z,user_data=None,user_id='c01210eeec574b6e98f04c03e858c140',uuid=905ba276-3439-4ffc-9fa7-b8ce71d79b96,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "2f616ccf-0f40-4768-8b90-07dae3707b82", "address": "fa:16:3e:d6:ac:0d", "network": {"id": "872b0f73-e6f1-41ef-b96e-e40b61240904", "bridge": "br-int", "label": "tempest-network-smoke--469697660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb6dd99c8537434e826f505f7c17fb9d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f616ccf-0f", "ovs_interfaceid": "2f616ccf-0f40-4768-8b90-07dae3707b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.456 2 DEBUG nova.network.os_vif_util [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Converting VIF {"id": "2f616ccf-0f40-4768-8b90-07dae3707b82", "address": "fa:16:3e:d6:ac:0d", "network": {"id": "872b0f73-e6f1-41ef-b96e-e40b61240904", "bridge": "br-int", "label": "tempest-network-smoke--469697660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb6dd99c8537434e826f505f7c17fb9d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f616ccf-0f", "ovs_interfaceid": "2f616ccf-0f40-4768-8b90-07dae3707b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.457 2 DEBUG nova.network.os_vif_util [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:ac:0d,bridge_name='br-int',has_traffic_filtering=True,id=2f616ccf-0f40-4768-8b90-07dae3707b82,network=Network(872b0f73-e6f1-41ef-b96e-e40b61240904),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f616ccf-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.457 2 DEBUG os_vif [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:ac:0d,bridge_name='br-int',has_traffic_filtering=True,id=2f616ccf-0f40-4768-8b90-07dae3707b82,network=Network(872b0f73-e6f1-41ef-b96e-e40b61240904),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f616ccf-0f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.458 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.458 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.461 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.461 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap2f616ccf-0f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.461 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap2f616ccf-0f, col_values=(('external_ids', {'iface-id': '2f616ccf-0f40-4768-8b90-07dae3707b82', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d6:ac:0d', 'vm-uuid': '905ba276-3439-4ffc-9fa7-b8ce71d79b96'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:27 np0005474864 NetworkManager[51631]: <info>  [1759867887.5139] manager: (tap2f616ccf-0f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/39)
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.523 2 INFO os_vif [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:ac:0d,bridge_name='br-int',has_traffic_filtering=True,id=2f616ccf-0f40-4768-8b90-07dae3707b82,network=Network(872b0f73-e6f1-41ef-b96e-e40b61240904),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f616ccf-0f')#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.525 2 DEBUG nova.network.neutron [req-3fe862f0-0cb7-4f26-84e9-b4f1f8c91694 req-21ef8bce-7efc-4d65-a3bd-44bb6e55e962 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Updated VIF entry in instance network info cache for port 9283f59d-4eb5-4e2a-876d-b078582f6dec. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.526 2 DEBUG nova.network.neutron [req-3fe862f0-0cb7-4f26-84e9-b4f1f8c91694 req-21ef8bce-7efc-4d65-a3bd-44bb6e55e962 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Updating instance_info_cache with network_info: [{"id": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "address": "fa:16:3e:6e:bc:c6", "network": {"id": "3c6f15ee-a1fe-4807-8291-599e41409640", "bridge": "br-int", "label": "tempest-network-smoke--1071304693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9283f59d-4e", "ovs_interfaceid": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.547 2 DEBUG oslo_concurrency.lockutils [req-3fe862f0-0cb7-4f26-84e9-b4f1f8c91694 req-21ef8bce-7efc-4d65-a3bd-44bb6e55e962 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-31cd065b-2fe3-418f-869b-a5ac7f4405f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.639 2 DEBUG nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.640 2 DEBUG nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.640 2 DEBUG nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] No VIF found with MAC fa:16:3e:d6:ac:0d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.641 2 INFO nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Using config drive#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.840 2 DEBUG nova.network.neutron [req-3fc04994-529a-4c59-b0aa-66dfc999359e req-e4ca77df-6a4f-486b-8507-222661a43348 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Updated VIF entry in instance network info cache for port 18c97918-cded-43d4-9ccc-3569cc948551. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.841 2 DEBUG nova.network.neutron [req-3fc04994-529a-4c59-b0aa-66dfc999359e req-e4ca77df-6a4f-486b-8507-222661a43348 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Updating instance_info_cache with network_info: [{"id": "18c97918-cded-43d4-9ccc-3569cc948551", "address": "fa:16:3e:d6:c9:ed", "network": {"id": "116d80b3-97b2-4699-8d30-7d57fd9728fe", "bridge": "br-int", "label": "tempest-network-smoke--421700815", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18c97918-cd", "ovs_interfaceid": "18c97918-cded-43d4-9ccc-3569cc948551", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:11:27 np0005474864 nova_compute[192593]: 2025-10-07 20:11:27.854 2 DEBUG oslo_concurrency.lockutils [req-3fc04994-529a-4c59-b0aa-66dfc999359e req-e4ca77df-6a4f-486b-8507-222661a43348 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-b00f20b4-40d9-4fe7-8782-20859f161134" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.066 2 DEBUG nova.compute.manager [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Received event network-changed-2f616ccf-0f40-4768-8b90-07dae3707b82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.066 2 DEBUG nova.compute.manager [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Refreshing instance network info cache due to event network-changed-2f616ccf-0f40-4768-8b90-07dae3707b82. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.067 2 DEBUG oslo_concurrency.lockutils [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-905ba276-3439-4ffc-9fa7-b8ce71d79b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.067 2 DEBUG oslo_concurrency.lockutils [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-905ba276-3439-4ffc-9fa7-b8ce71d79b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.067 2 DEBUG nova.network.neutron [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Refreshing network info cache for port 2f616ccf-0f40-4768-8b90-07dae3707b82 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.159 2 DEBUG nova.compute.manager [req-b6201e84-f2bb-4246-beea-236f184346bc req-280d734c-989e-48b7-9daa-bc7f44b221c0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Received event network-vif-plugged-18c97918-cded-43d4-9ccc-3569cc948551 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.160 2 DEBUG oslo_concurrency.lockutils [req-b6201e84-f2bb-4246-beea-236f184346bc req-280d734c-989e-48b7-9daa-bc7f44b221c0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "b00f20b4-40d9-4fe7-8782-20859f161134-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.160 2 DEBUG oslo_concurrency.lockutils [req-b6201e84-f2bb-4246-beea-236f184346bc req-280d734c-989e-48b7-9daa-bc7f44b221c0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "b00f20b4-40d9-4fe7-8782-20859f161134-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.160 2 DEBUG oslo_concurrency.lockutils [req-b6201e84-f2bb-4246-beea-236f184346bc req-280d734c-989e-48b7-9daa-bc7f44b221c0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "b00f20b4-40d9-4fe7-8782-20859f161134-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.160 2 DEBUG nova.compute.manager [req-b6201e84-f2bb-4246-beea-236f184346bc req-280d734c-989e-48b7-9daa-bc7f44b221c0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Processing event network-vif-plugged-18c97918-cded-43d4-9ccc-3569cc948551 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.161 2 DEBUG nova.compute.manager [req-b6201e84-f2bb-4246-beea-236f184346bc req-280d734c-989e-48b7-9daa-bc7f44b221c0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Received event network-vif-plugged-18c97918-cded-43d4-9ccc-3569cc948551 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.161 2 DEBUG oslo_concurrency.lockutils [req-b6201e84-f2bb-4246-beea-236f184346bc req-280d734c-989e-48b7-9daa-bc7f44b221c0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "b00f20b4-40d9-4fe7-8782-20859f161134-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.161 2 DEBUG oslo_concurrency.lockutils [req-b6201e84-f2bb-4246-beea-236f184346bc req-280d734c-989e-48b7-9daa-bc7f44b221c0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "b00f20b4-40d9-4fe7-8782-20859f161134-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.161 2 DEBUG oslo_concurrency.lockutils [req-b6201e84-f2bb-4246-beea-236f184346bc req-280d734c-989e-48b7-9daa-bc7f44b221c0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "b00f20b4-40d9-4fe7-8782-20859f161134-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.161 2 DEBUG nova.compute.manager [req-b6201e84-f2bb-4246-beea-236f184346bc req-280d734c-989e-48b7-9daa-bc7f44b221c0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] No waiting events found dispatching network-vif-plugged-18c97918-cded-43d4-9ccc-3569cc948551 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.162 2 WARNING nova.compute.manager [req-b6201e84-f2bb-4246-beea-236f184346bc req-280d734c-989e-48b7-9daa-bc7f44b221c0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Received unexpected event network-vif-plugged-18c97918-cded-43d4-9ccc-3569cc948551 for instance with vm_state building and task_state spawning.#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.162 2 DEBUG nova.compute.manager [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.166 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759867888.166098, b00f20b4-40d9-4fe7-8782-20859f161134 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.167 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.171 2 DEBUG nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.176 2 INFO nova.virt.libvirt.driver [-] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Instance spawned successfully.#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.176 2 DEBUG nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.183 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.186 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.199 2 DEBUG nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.200 2 DEBUG nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.201 2 DEBUG nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.202 2 DEBUG nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.203 2 DEBUG nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.203 2 DEBUG nova.virt.libvirt.driver [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.211 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.227 2 INFO nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Creating config drive at /var/lib/nova/instances/905ba276-3439-4ffc-9fa7-b8ce71d79b96/disk.config#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.237 2 DEBUG oslo_concurrency.processutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/905ba276-3439-4ffc-9fa7-b8ce71d79b96/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplfjkzx1s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.269 2 INFO nova.compute.manager [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Took 13.54 seconds to spawn the instance on the hypervisor.#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.270 2 DEBUG nova.compute.manager [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.347 2 INFO nova.compute.manager [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Took 14.47 seconds to build instance.#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.365 2 DEBUG oslo_concurrency.processutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/905ba276-3439-4ffc-9fa7-b8ce71d79b96/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmplfjkzx1s" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.381 2 DEBUG oslo_concurrency.lockutils [None req-669f021c-206e-426a-8572-44b59bd7011f fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "b00f20b4-40d9-4fe7-8782-20859f161134" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:28 np0005474864 kernel: tap2f616ccf-0f: entered promiscuous mode
Oct  7 16:11:28 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:28Z|00049|binding|INFO|Claiming lport 2f616ccf-0f40-4768-8b90-07dae3707b82 for this chassis.
Oct  7 16:11:28 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:28Z|00050|binding|INFO|2f616ccf-0f40-4768-8b90-07dae3707b82: Claiming fa:16:3e:d6:ac:0d 10.100.0.8
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.426 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:11:28 np0005474864 NetworkManager[51631]: <info>  [1759867888.4281] manager: (tap2f616ccf-0f): new Tun device (/org/freedesktop/NetworkManager/Devices/40)
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.434 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:ac:0d 10.100.0.8'], port_security=['fa:16:3e:d6:ac:0d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '905ba276-3439-4ffc-9fa7-b8ce71d79b96', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-872b0f73-e6f1-41ef-b96e-e40b61240904', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eb6dd99c8537434e826f505f7c17fb9d', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2ad047e3-0557-4837-bfbc-0bc867d5bdab abd2f5b7-715d-4885-a879-b191e588ae09', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a52d61e3-7a73-40b5-91b2-011ed52540be, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=2f616ccf-0f40-4768-8b90-07dae3707b82) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.435 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 2f616ccf-0f40-4768-8b90-07dae3707b82 in datapath 872b0f73-e6f1-41ef-b96e-e40b61240904 bound to our chassis
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.439 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 872b0f73-e6f1-41ef-b96e-e40b61240904
Oct  7 16:11:28 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:28Z|00051|binding|INFO|Setting lport 2f616ccf-0f40-4768-8b90-07dae3707b82 ovn-installed in OVS
Oct  7 16:11:28 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:28Z|00052|binding|INFO|Setting lport 2f616ccf-0f40-4768-8b90-07dae3707b82 up in Southbound
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.455 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[50d4c675-0a95-4785-8537-24f46bf79c60]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.456 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap872b0f73-e1 in ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.458 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap872b0f73-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.458 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ace79941-5ab3-45eb-8789-c9525ff200fc]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.460 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[2baa99aa-6793-4700-8db7-63c96bb965ae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  7 16:11:28 np0005474864 systemd-udevd[221349]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:11:28 np0005474864 systemd-machined[152586]: New machine qemu-4-instance-0000000c.
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.478 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[0114dc88-d379-42df-b3c8-d372b6a60568]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  7 16:11:28 np0005474864 systemd[1]: Started Virtual Machine qemu-4-instance-0000000c.
Oct  7 16:11:28 np0005474864 NetworkManager[51631]: <info>  [1759867888.4936] device (tap2f616ccf-0f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:11:28 np0005474864 NetworkManager[51631]: <info>  [1759867888.4947] device (tap2f616ccf-0f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.496 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[b4caea72-59b6-4e58-8c67-f2c907d203c2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.525 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[d90415a4-f90b-4e91-b13a-be4ca073c0ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  7 16:11:28 np0005474864 systemd-udevd[221353]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:11:28 np0005474864 NetworkManager[51631]: <info>  [1759867888.5315] manager: (tap872b0f73-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/41)
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.532 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[9d15d6fb-553d-4318-9247-fe93ef5bccd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.577 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[831def8b-bdbf-4e7d-bb6d-1001e5f83bff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.582 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[e66ab87f-106f-4548-9b87-fb3d7858eb21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  7 16:11:28 np0005474864 NetworkManager[51631]: <info>  [1759867888.6158] device (tap872b0f73-e0): carrier: link connected
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.624 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[19be1230-80dd-45c2-a672-0afd828a1b8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.653 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[6eceb1c5-10c3-449f-b3bc-bf4f1b5b943e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap872b0f73-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1e:8d:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 352450, 'reachable_time': 42591, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221381, 'error': None, 'target': 'ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.679 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[a0866960-1f06-4bcc-a501-048ea390ebfc]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1e:8d51'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 352450, 'tstamp': 352450}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221382, 'error': None, 'target': 'ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.703 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[6db4e5d9-8a09-4cfc-893b-9a334bf1ba01]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap872b0f73-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1e:8d:51'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 352450, 'reachable_time': 42591, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221383, 'error': None, 'target': 'ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.746 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[f4559355-6fef-43e5-9f16-eca7af445df7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.825 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[0f53e739-2ea6-4151-b6d1-4dca584c0728]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.827 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap872b0f73-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.828 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.828 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap872b0f73-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.860 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:11:28 np0005474864 NetworkManager[51631]: <info>  [1759867888.8615] manager: (tap872b0f73-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Oct  7 16:11:28 np0005474864 kernel: tap872b0f73-e0: entered promiscuous mode
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.869 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap872b0f73-e0, col_values=(('external_ids', {'iface-id': 'c5ef72cf-6df8-4a8a-8a49-e0344354f772'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:11:28 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:28Z|00053|binding|INFO|Releasing lport c5ef72cf-6df8-4a8a-8a49-e0344354f772 from this chassis (sb_readonly=0)
Oct  7 16:11:28 np0005474864 nova_compute[192593]: 2025-10-07 20:11:28.885 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.889 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/872b0f73-e6f1-41ef-b96e-e40b61240904.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/872b0f73-e6f1-41ef-b96e-e40b61240904.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.891 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[913b22d3-a7fd-4911-b6ae-9fb4269d684b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.892 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-872b0f73-e6f1-41ef-b96e-e40b61240904
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/872b0f73-e6f1-41ef-b96e-e40b61240904.pid.haproxy
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID 872b0f73-e6f1-41ef-b96e-e40b61240904
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Oct  7 16:11:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:28.897 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904', 'env', 'PROCESS_TAG=haproxy-872b0f73-e6f1-41ef-b96e-e40b61240904', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/872b0f73-e6f1-41ef-b96e-e40b61240904.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Oct  7 16:11:29 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:29Z|00054|binding|INFO|Releasing lport 3ef78041-272c-48f8-bb33-3cc1fdd4d058 from this chassis (sb_readonly=0)
Oct  7 16:11:29 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:29Z|00055|binding|INFO|Releasing lport c5ef72cf-6df8-4a8a-8a49-e0344354f772 from this chassis (sb_readonly=0)
Oct  7 16:11:29 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:29Z|00056|binding|INFO|Releasing lport 7ace7da2-42dc-433a-8b4d-8286301cfa0e from this chassis (sb_readonly=0)
Oct  7 16:11:29 np0005474864 nova_compute[192593]: 2025-10-07 20:11:29.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:11:29 np0005474864 nova_compute[192593]: 2025-10-07 20:11:29.355 2 DEBUG oslo_concurrency.lockutils [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:11:29 np0005474864 nova_compute[192593]: 2025-10-07 20:11:29.357 2 DEBUG oslo_concurrency.lockutils [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:11:29 np0005474864 nova_compute[192593]: 2025-10-07 20:11:29.357 2 DEBUG nova.compute.manager [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Going to confirm migration 2 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679
Oct  7 16:11:29 np0005474864 nova_compute[192593]: 2025-10-07 20:11:29.390 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759867889.3897576, 905ba276-3439-4ffc-9fa7-b8ce71d79b96 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  7 16:11:29 np0005474864 nova_compute[192593]: 2025-10-07 20:11:29.390 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] VM Started (Lifecycle Event)
Oct  7 16:11:29 np0005474864 podman[221419]: 2025-10-07 20:11:29.332116228 +0000 UTC m=+0.037821239 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:11:29 np0005474864 nova_compute[192593]: 2025-10-07 20:11:29.428 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  7 16:11:29 np0005474864 nova_compute[192593]: 2025-10-07 20:11:29.435 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759867889.3898711, 905ba276-3439-4ffc-9fa7-b8ce71d79b96 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  7 16:11:29 np0005474864 nova_compute[192593]: 2025-10-07 20:11:29.436 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] VM Paused (Lifecycle Event)
Oct  7 16:11:29 np0005474864 nova_compute[192593]: 2025-10-07 20:11:29.453 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  7 16:11:29 np0005474864 nova_compute[192593]: 2025-10-07 20:11:29.458 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Oct  7 16:11:29 np0005474864 nova_compute[192593]: 2025-10-07 20:11:29.487 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] During sync_power_state the instance has a pending task (spawning). Skip.
Oct  7 16:11:29 np0005474864 podman[221419]: 2025-10-07 20:11:29.696346022 +0000 UTC m=+0.402050973 container create 2f525620a1c0a06006bf38c8fb41a8f515ffc5ce4a99e089e702050afe62fa5a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  7 16:11:29 np0005474864 systemd[1]: Started libpod-conmon-2f525620a1c0a06006bf38c8fb41a8f515ffc5ce4a99e089e702050afe62fa5a.scope.
Oct  7 16:11:29 np0005474864 podman[221433]: 2025-10-07 20:11:29.857278911 +0000 UTC m=+0.108763999 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  7 16:11:29 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:11:29 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/02cad8079826475d1d2a5ea6e0c5fdb8c3156821e255025ea27ca056e5437dc2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:11:29 np0005474864 podman[221435]: 2025-10-07 20:11:29.916584376 +0000 UTC m=+0.170086982 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  7 16:11:30 np0005474864 podman[221419]: 2025-10-07 20:11:30.057840698 +0000 UTC m=+0.763545729 container init 2f525620a1c0a06006bf38c8fb41a8f515ffc5ce4a99e089e702050afe62fa5a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:11:30 np0005474864 podman[221419]: 2025-10-07 20:11:30.064613103 +0000 UTC m=+0.770318084 container start 2f525620a1c0a06006bf38c8fb41a8f515ffc5ce4a99e089e702050afe62fa5a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:11:30 np0005474864 podman[221434]: 2025-10-07 20:11:30.065445817 +0000 UTC m=+0.312454026 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  7 16:11:30 np0005474864 neutron-haproxy-ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904[221493]: [NOTICE]   (221506) : New worker (221508) forked
Oct  7 16:11:30 np0005474864 neutron-haproxy-ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904[221493]: [NOTICE]   (221506) : Loading success.
Oct  7 16:11:31 np0005474864 nova_compute[192593]: 2025-10-07 20:11:31.702 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759867876.7014663, 31cd065b-2fe3-418f-869b-a5ac7f4405f8 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:11:31 np0005474864 nova_compute[192593]: 2025-10-07 20:11:31.703 2 INFO nova.compute.manager [-] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] VM Stopped (Lifecycle Event)#033[00m
Oct  7 16:11:31 np0005474864 nova_compute[192593]: 2025-10-07 20:11:31.821 2 DEBUG nova.compute.manager [None req-08a7fc4d-5dac-4e32-85df-c1bab7e2909d - - - - - -] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:11:31 np0005474864 nova_compute[192593]: 2025-10-07 20:11:31.826 2 DEBUG nova.compute.manager [None req-08a7fc4d-5dac-4e32-85df-c1bab7e2909d - - - - - -] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:11:31 np0005474864 nova_compute[192593]: 2025-10-07 20:11:31.871 2 INFO nova.compute.manager [None req-08a7fc4d-5dac-4e32-85df-c1bab7e2909d - - - - - -] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.079 2 DEBUG neutronclient.v2_0.client [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port 9283f59d-4eb5-4e2a-876d-b078582f6dec for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.080 2 DEBUG oslo_concurrency.lockutils [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "refresh_cache-31cd065b-2fe3-418f-869b-a5ac7f4405f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.081 2 DEBUG oslo_concurrency.lockutils [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquired lock "refresh_cache-31cd065b-2fe3-418f-869b-a5ac7f4405f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.081 2 DEBUG nova.network.neutron [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.082 2 DEBUG nova.objects.instance [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lazy-loading 'info_cache' on Instance uuid 31cd065b-2fe3-418f-869b-a5ac7f4405f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.191 2 DEBUG nova.network.neutron [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Updated VIF entry in instance network info cache for port 2f616ccf-0f40-4768-8b90-07dae3707b82. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.192 2 DEBUG nova.network.neutron [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Updating instance_info_cache with network_info: [{"id": "2f616ccf-0f40-4768-8b90-07dae3707b82", "address": "fa:16:3e:d6:ac:0d", "network": {"id": "872b0f73-e6f1-41ef-b96e-e40b61240904", "bridge": "br-int", "label": "tempest-network-smoke--469697660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb6dd99c8537434e826f505f7c17fb9d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f616ccf-0f", "ovs_interfaceid": "2f616ccf-0f40-4768-8b90-07dae3707b82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.210 2 DEBUG oslo_concurrency.lockutils [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-905ba276-3439-4ffc-9fa7-b8ce71d79b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.211 2 DEBUG nova.compute.manager [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Received event network-vif-plugged-9283f59d-4eb5-4e2a-876d-b078582f6dec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.211 2 DEBUG oslo_concurrency.lockutils [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.211 2 DEBUG oslo_concurrency.lockutils [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.212 2 DEBUG oslo_concurrency.lockutils [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.212 2 DEBUG nova.compute.manager [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] No waiting events found dispatching network-vif-plugged-9283f59d-4eb5-4e2a-876d-b078582f6dec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.212 2 WARNING nova.compute.manager [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Received unexpected event network-vif-plugged-9283f59d-4eb5-4e2a-876d-b078582f6dec for instance with vm_state resized and task_state None.#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.212 2 DEBUG nova.compute.manager [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Received event network-vif-plugged-9283f59d-4eb5-4e2a-876d-b078582f6dec external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.213 2 DEBUG oslo_concurrency.lockutils [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.213 2 DEBUG oslo_concurrency.lockutils [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.213 2 DEBUG oslo_concurrency.lockutils [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.213 2 DEBUG nova.compute.manager [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] No waiting events found dispatching network-vif-plugged-9283f59d-4eb5-4e2a-876d-b078582f6dec pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.214 2 WARNING nova.compute.manager [req-0817c83f-132f-4ea9-8209-71b30f3ed584 req-532f4e0c-b565-4853-b32e-760754a74bc4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Received unexpected event network-vif-plugged-9283f59d-4eb5-4e2a-876d-b078582f6dec for instance with vm_state resized and task_state None.#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.338 2 DEBUG nova.compute.manager [req-0edf449c-0b1a-497b-a7d3-7bf17a163f0b req-479428e4-f92c-44ef-8478-5c1edf4f7366 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Received event network-vif-plugged-2f616ccf-0f40-4768-8b90-07dae3707b82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.339 2 DEBUG oslo_concurrency.lockutils [req-0edf449c-0b1a-497b-a7d3-7bf17a163f0b req-479428e4-f92c-44ef-8478-5c1edf4f7366 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.340 2 DEBUG oslo_concurrency.lockutils [req-0edf449c-0b1a-497b-a7d3-7bf17a163f0b req-479428e4-f92c-44ef-8478-5c1edf4f7366 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.340 2 DEBUG oslo_concurrency.lockutils [req-0edf449c-0b1a-497b-a7d3-7bf17a163f0b req-479428e4-f92c-44ef-8478-5c1edf4f7366 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.340 2 DEBUG nova.compute.manager [req-0edf449c-0b1a-497b-a7d3-7bf17a163f0b req-479428e4-f92c-44ef-8478-5c1edf4f7366 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Processing event network-vif-plugged-2f616ccf-0f40-4768-8b90-07dae3707b82 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.341 2 DEBUG nova.compute.manager [req-0edf449c-0b1a-497b-a7d3-7bf17a163f0b req-479428e4-f92c-44ef-8478-5c1edf4f7366 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Received event network-vif-plugged-2f616ccf-0f40-4768-8b90-07dae3707b82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.341 2 DEBUG oslo_concurrency.lockutils [req-0edf449c-0b1a-497b-a7d3-7bf17a163f0b req-479428e4-f92c-44ef-8478-5c1edf4f7366 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.341 2 DEBUG oslo_concurrency.lockutils [req-0edf449c-0b1a-497b-a7d3-7bf17a163f0b req-479428e4-f92c-44ef-8478-5c1edf4f7366 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.342 2 DEBUG oslo_concurrency.lockutils [req-0edf449c-0b1a-497b-a7d3-7bf17a163f0b req-479428e4-f92c-44ef-8478-5c1edf4f7366 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.342 2 DEBUG nova.compute.manager [req-0edf449c-0b1a-497b-a7d3-7bf17a163f0b req-479428e4-f92c-44ef-8478-5c1edf4f7366 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] No waiting events found dispatching network-vif-plugged-2f616ccf-0f40-4768-8b90-07dae3707b82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.342 2 WARNING nova.compute.manager [req-0edf449c-0b1a-497b-a7d3-7bf17a163f0b req-479428e4-f92c-44ef-8478-5c1edf4f7366 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Received unexpected event network-vif-plugged-2f616ccf-0f40-4768-8b90-07dae3707b82 for instance with vm_state building and task_state spawning.#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.343 2 DEBUG nova.compute.manager [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.351 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759867892.351633, 905ba276-3439-4ffc-9fa7-b8ce71d79b96 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.352 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.354 2 DEBUG nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.357 2 INFO nova.virt.libvirt.driver [-] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Instance spawned successfully.#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.358 2 DEBUG nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.375 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.383 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.386 2 DEBUG nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.387 2 DEBUG nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.388 2 DEBUG nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.388 2 DEBUG nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.389 2 DEBUG nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.389 2 DEBUG nova.virt.libvirt.driver [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.426 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.452 2 INFO nova.compute.manager [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Took 13.22 seconds to spawn the instance on the hypervisor.#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.453 2 DEBUG nova.compute.manager [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.515 2 INFO nova.compute.manager [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Took 13.79 seconds to build instance.#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:11:32 np0005474864 nova_compute[192593]: 2025-10-07 20:11:32.538 2 DEBUG oslo_concurrency.lockutils [None req-6662729f-18e5-41fb-8a21-f0229ce7d973 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.956s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.176 2 DEBUG nova.network.neutron [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 31cd065b-2fe3-418f-869b-a5ac7f4405f8] Updating instance_info_cache with network_info: [{"id": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "address": "fa:16:3e:6e:bc:c6", "network": {"id": "3c6f15ee-a1fe-4807-8291-599e41409640", "bridge": "br-int", "label": "tempest-network-smoke--1071304693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9283f59d-4e", "ovs_interfaceid": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.221 2 DEBUG oslo_concurrency.lockutils [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Releasing lock "refresh_cache-31cd065b-2fe3-418f-869b-a5ac7f4405f8" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.222 2 DEBUG nova.objects.instance [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lazy-loading 'migration_context' on Instance uuid 31cd065b-2fe3-418f-869b-a5ac7f4405f8 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.252 2 DEBUG nova.virt.libvirt.host [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.253 2 INFO nova.virt.libvirt.host [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] UEFI support detected#033[00m
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.255 2 DEBUG nova.virt.libvirt.vif [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:10:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1391753478',display_name='tempest-TestNetworkAdvancedServerOps-server-1391753478',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-1.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1391753478',id=3,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBHGp9K/m1XQlBJloQMlWOiYAkMHRg/+YyV7EIFeU64B1nJDtz1wGfsQsDxfqhOEvcl/IBS6gweH/4Fue49rFzrh66+jFDwTRyWcSgsUsGaMU3Uma/s2qqLF3+L5vxqg9xw==',key_name='tempest-TestNetworkAdvancedServerOps-1360006664',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:11:26Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-1.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8a545a398e2e433bbe3f3dfa2ec4ebcb',ramdisk_id='',reservation_id='r-6hld8jn4',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-585003851',owner_user_name='tempest-TestNetworkAdvancedServerOps-585003851-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:11:26Z,user_data=None,user_id='db22b0e0f6594362af24484ba9b01936',uuid=31cd065b-2fe3-418f-869b-a5ac7f4405f8,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "address": "fa:16:3e:6e:bc:c6", "network": {"id": "3c6f15ee-a1fe-4807-8291-599e41409640", "bridge": "br-int", "label": "tempest-network-smoke--1071304693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9283f59d-4e", "ovs_interfaceid": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.256 2 DEBUG nova.network.os_vif_util [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converting VIF {"id": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "address": "fa:16:3e:6e:bc:c6", "network": {"id": "3c6f15ee-a1fe-4807-8291-599e41409640", "bridge": "br-int", "label": "tempest-network-smoke--1071304693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.178", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9283f59d-4e", "ovs_interfaceid": "9283f59d-4eb5-4e2a-876d-b078582f6dec", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.257 2 DEBUG nova.network.os_vif_util [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6e:bc:c6,bridge_name='br-int',has_traffic_filtering=True,id=9283f59d-4eb5-4e2a-876d-b078582f6dec,network=Network(3c6f15ee-a1fe-4807-8291-599e41409640),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9283f59d-4e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.257 2 DEBUG os_vif [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6e:bc:c6,bridge_name='br-int',has_traffic_filtering=True,id=9283f59d-4eb5-4e2a-876d-b078582f6dec,network=Network(3c6f15ee-a1fe-4807-8291-599e41409640),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9283f59d-4e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.260 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9283f59d-4e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.261 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.263 2 INFO os_vif [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6e:bc:c6,bridge_name='br-int',has_traffic_filtering=True,id=9283f59d-4eb5-4e2a-876d-b078582f6dec,network=Network(3c6f15ee-a1fe-4807-8291-599e41409640),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9283f59d-4e')#033[00m
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.264 2 DEBUG oslo_concurrency.lockutils [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.264 2 DEBUG oslo_concurrency.lockutils [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:34 np0005474864 podman[221517]: 2025-10-07 20:11:34.36451758 +0000 UTC m=+0.061132059 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.377 2 DEBUG nova.compute.provider_tree [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.397 2 DEBUG nova.scheduler.client.report [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.452 2 DEBUG oslo_concurrency.lockutils [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.188s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.583 2 INFO nova.scheduler.client.report [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Deleted allocation for migration 1b763f3f-6a93-4f21-a96f-734cc9bc61be#033[00m
Oct  7 16:11:34 np0005474864 nova_compute[192593]: 2025-10-07 20:11:34.648 2 DEBUG oslo_concurrency.lockutils [None req-ae196619-64ba-4396-8daf-9013d8e491c5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "31cd065b-2fe3-418f-869b-a5ac7f4405f8" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 5.291s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:37 np0005474864 nova_compute[192593]: 2025-10-07 20:11:37.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:38 np0005474864 podman[221536]: 2025-10-07 20:11:38.395756851 +0000 UTC m=+0.076535652 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  7 16:11:39 np0005474864 nova_compute[192593]: 2025-10-07 20:11:39.384 2 DEBUG nova.compute.manager [req-2534817c-8aa8-49d5-b07b-e2e69c12d2ed req-1d66fc3c-c319-4bb8-a84f-919b92c62f32 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Received event network-changed-2f616ccf-0f40-4768-8b90-07dae3707b82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:11:39 np0005474864 nova_compute[192593]: 2025-10-07 20:11:39.385 2 DEBUG nova.compute.manager [req-2534817c-8aa8-49d5-b07b-e2e69c12d2ed req-1d66fc3c-c319-4bb8-a84f-919b92c62f32 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Refreshing instance network info cache due to event network-changed-2f616ccf-0f40-4768-8b90-07dae3707b82. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:11:39 np0005474864 nova_compute[192593]: 2025-10-07 20:11:39.385 2 DEBUG oslo_concurrency.lockutils [req-2534817c-8aa8-49d5-b07b-e2e69c12d2ed req-1d66fc3c-c319-4bb8-a84f-919b92c62f32 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-905ba276-3439-4ffc-9fa7-b8ce71d79b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:11:39 np0005474864 nova_compute[192593]: 2025-10-07 20:11:39.385 2 DEBUG oslo_concurrency.lockutils [req-2534817c-8aa8-49d5-b07b-e2e69c12d2ed req-1d66fc3c-c319-4bb8-a84f-919b92c62f32 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-905ba276-3439-4ffc-9fa7-b8ce71d79b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:11:39 np0005474864 nova_compute[192593]: 2025-10-07 20:11:39.386 2 DEBUG nova.network.neutron [req-2534817c-8aa8-49d5-b07b-e2e69c12d2ed req-1d66fc3c-c319-4bb8-a84f-919b92c62f32 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Refreshing network info cache for port 2f616ccf-0f40-4768-8b90-07dae3707b82 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:11:41 np0005474864 podman[221575]: 2025-10-07 20:11:41.392147221 +0000 UTC m=+0.086699634 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, 
io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  7 16:11:41 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:41Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d6:c9:ed 10.100.0.27
Oct  7 16:11:41 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:41Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d6:c9:ed 10.100.0.27
Oct  7 16:11:42 np0005474864 nova_compute[192593]: 2025-10-07 20:11:42.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:11:42 np0005474864 nova_compute[192593]: 2025-10-07 20:11:42.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:11:42 np0005474864 nova_compute[192593]: 2025-10-07 20:11:42.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5004 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  7 16:11:42 np0005474864 nova_compute[192593]: 2025-10-07 20:11:42.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:11:42 np0005474864 nova_compute[192593]: 2025-10-07 20:11:42.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:42 np0005474864 nova_compute[192593]: 2025-10-07 20:11:42.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:11:44 np0005474864 nova_compute[192593]: 2025-10-07 20:11:44.336 2 DEBUG nova.network.neutron [req-2534817c-8aa8-49d5-b07b-e2e69c12d2ed req-1d66fc3c-c319-4bb8-a84f-919b92c62f32 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Updated VIF entry in instance network info cache for port 2f616ccf-0f40-4768-8b90-07dae3707b82. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:11:44 np0005474864 nova_compute[192593]: 2025-10-07 20:11:44.336 2 DEBUG nova.network.neutron [req-2534817c-8aa8-49d5-b07b-e2e69c12d2ed req-1d66fc3c-c319-4bb8-a84f-919b92c62f32 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Updating instance_info_cache with network_info: [{"id": "2f616ccf-0f40-4768-8b90-07dae3707b82", "address": "fa:16:3e:d6:ac:0d", "network": {"id": "872b0f73-e6f1-41ef-b96e-e40b61240904", "bridge": "br-int", "label": "tempest-network-smoke--469697660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb6dd99c8537434e826f505f7c17fb9d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f616ccf-0f", "ovs_interfaceid": "2f616ccf-0f40-4768-8b90-07dae3707b82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:11:44 np0005474864 nova_compute[192593]: 2025-10-07 20:11:44.351 2 DEBUG oslo_concurrency.lockutils [req-2534817c-8aa8-49d5-b07b-e2e69c12d2ed req-1d66fc3c-c319-4bb8-a84f-919b92c62f32 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-905ba276-3439-4ffc-9fa7-b8ce71d79b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:11:45 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:45Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d6:ac:0d 10.100.0.8
Oct  7 16:11:45 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:45Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d6:ac:0d 10.100.0.8
Oct  7 16:11:47 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:47.337 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:76:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'ba:bb:91:e8:2b:5d'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:11:47 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:47.338 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  7 16:11:47 np0005474864 nova_compute[192593]: 2025-10-07 20:11:47.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:47 np0005474864 nova_compute[192593]: 2025-10-07 20:11:47.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:47 np0005474864 nova_compute[192593]: 2025-10-07 20:11:47.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.168 2 DEBUG oslo_concurrency.lockutils [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "b00f20b4-40d9-4fe7-8782-20859f161134" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.168 2 DEBUG oslo_concurrency.lockutils [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "b00f20b4-40d9-4fe7-8782-20859f161134" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.169 2 DEBUG oslo_concurrency.lockutils [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "b00f20b4-40d9-4fe7-8782-20859f161134-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.169 2 DEBUG oslo_concurrency.lockutils [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "b00f20b4-40d9-4fe7-8782-20859f161134-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.169 2 DEBUG oslo_concurrency.lockutils [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "b00f20b4-40d9-4fe7-8782-20859f161134-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.170 2 INFO nova.compute.manager [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Terminating instance#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.171 2 DEBUG nova.compute.manager [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  7 16:11:51 np0005474864 kernel: tap18c97918-cd (unregistering): left promiscuous mode
Oct  7 16:11:51 np0005474864 NetworkManager[51631]: <info>  [1759867911.1982] device (tap18c97918-cd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:11:51 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:51Z|00057|binding|INFO|Releasing lport 18c97918-cded-43d4-9ccc-3569cc948551 from this chassis (sb_readonly=0)
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:51 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:51Z|00058|binding|INFO|Setting lport 18c97918-cded-43d4-9ccc-3569cc948551 down in Southbound
Oct  7 16:11:51 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:51Z|00059|binding|INFO|Removing iface tap18c97918-cd ovn-installed in OVS
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:51.221 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:c9:ed 10.100.0.27'], port_security=['fa:16:3e:d6:c9:ed 10.100.0.27'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.27/28', 'neutron:device_id': 'b00f20b4-40d9-4fe7-8782-20859f161134', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-116d80b3-97b2-4699-8d30-7d57fd9728fe', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '57491b24c6b2419c842483a87c8b4d42', 'neutron:revision_number': '4', 'neutron:security_group_ids': '18e2fd14-c88a-4a66-8665-cb12f02155ff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1aac10a9-f2b0-4902-ab46-4b1f397c010b, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=18c97918-cded-43d4-9ccc-3569cc948551) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:11:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:51.224 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 18c97918-cded-43d4-9ccc-3569cc948551 in datapath 116d80b3-97b2-4699-8d30-7d57fd9728fe unbound from our chassis#033[00m
Oct  7 16:11:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:51.231 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 116d80b3-97b2-4699-8d30-7d57fd9728fe, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:11:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:51.233 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d364e23c-8d20-4937-a96b-b597b4323673]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:51.234 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe namespace which is not needed anymore#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:51 np0005474864 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d0000000b.scope: Deactivated successfully.
Oct  7 16:11:51 np0005474864 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d0000000b.scope: Consumed 14.977s CPU time.
Oct  7 16:11:51 np0005474864 systemd-machined[152586]: Machine qemu-3-instance-0000000b terminated.
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:51 np0005474864 neutron-haproxy-ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe[221313]: [NOTICE]   (221317) : haproxy version is 2.8.14-c23fe91
Oct  7 16:11:51 np0005474864 neutron-haproxy-ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe[221313]: [NOTICE]   (221317) : path to executable is /usr/sbin/haproxy
Oct  7 16:11:51 np0005474864 neutron-haproxy-ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe[221313]: [WARNING]  (221317) : Exiting Master process...
Oct  7 16:11:51 np0005474864 neutron-haproxy-ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe[221313]: [WARNING]  (221317) : Exiting Master process...
Oct  7 16:11:51 np0005474864 neutron-haproxy-ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe[221313]: [ALERT]    (221317) : Current worker (221319) exited with code 143 (Terminated)
Oct  7 16:11:51 np0005474864 neutron-haproxy-ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe[221313]: [WARNING]  (221317) : All workers exited. Exiting... (0)
Oct  7 16:11:51 np0005474864 systemd[1]: libpod-86e75a47dc64fcd70cb91696a35e3d2db616fb42a44a1fda1aa919bb6cc2c7d7.scope: Deactivated successfully.
Oct  7 16:11:51 np0005474864 podman[221631]: 2025-10-07 20:11:51.466773547 +0000 UTC m=+0.109729277 container died 86e75a47dc64fcd70cb91696a35e3d2db616fb42a44a1fda1aa919bb6cc2c7d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.467 2 INFO nova.virt.libvirt.driver [-] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Instance destroyed successfully.#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.469 2 DEBUG nova.objects.instance [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lazy-loading 'resources' on Instance uuid b00f20b4-40d9-4fe7-8782-20859f161134 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.480 2 DEBUG nova.virt.libvirt.vif [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:11:13Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-36228192',display_name='tempest-TestNetworkBasicOps-server-36228192',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-36228192',id=11,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBETfy9GNnTELxCLuS0VSQmy+oeDAuygUBwGUg5UFOaSrJAPx2qJYkaTHuzeoQ/vg5ArfEzqe69+o7Q9PtFBcHv7rv9msY/iszJVgKdNyHhqs7VTb3gngtn9e2I03GN5ceQ==',key_name='tempest-TestNetworkBasicOps-568178070',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:11:28Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-2kz6cpbc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:11:28Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=b00f20b4-40d9-4fe7-8782-20859f161134,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "18c97918-cded-43d4-9ccc-3569cc948551", "address": "fa:16:3e:d6:c9:ed", "network": {"id": "116d80b3-97b2-4699-8d30-7d57fd9728fe", "bridge": "br-int", "label": "tempest-network-smoke--421700815", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18c97918-cd", "ovs_interfaceid": "18c97918-cded-43d4-9ccc-3569cc948551", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.481 2 DEBUG nova.network.os_vif_util [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converting VIF {"id": "18c97918-cded-43d4-9ccc-3569cc948551", "address": "fa:16:3e:d6:c9:ed", "network": {"id": "116d80b3-97b2-4699-8d30-7d57fd9728fe", "bridge": "br-int", "label": "tempest-network-smoke--421700815", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.27", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap18c97918-cd", "ovs_interfaceid": "18c97918-cded-43d4-9ccc-3569cc948551", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.482 2 DEBUG nova.network.os_vif_util [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d6:c9:ed,bridge_name='br-int',has_traffic_filtering=True,id=18c97918-cded-43d4-9ccc-3569cc948551,network=Network(116d80b3-97b2-4699-8d30-7d57fd9728fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18c97918-cd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.483 2 DEBUG os_vif [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:c9:ed,bridge_name='br-int',has_traffic_filtering=True,id=18c97918-cded-43d4-9ccc-3569cc948551,network=Network(116d80b3-97b2-4699-8d30-7d57fd9728fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18c97918-cd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.485 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.485 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap18c97918-cd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.493 2 INFO os_vif [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d6:c9:ed,bridge_name='br-int',has_traffic_filtering=True,id=18c97918-cded-43d4-9ccc-3569cc948551,network=Network(116d80b3-97b2-4699-8d30-7d57fd9728fe),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap18c97918-cd')#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.494 2 INFO nova.virt.libvirt.driver [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Deleting instance files /var/lib/nova/instances/b00f20b4-40d9-4fe7-8782-20859f161134_del#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.495 2 INFO nova.virt.libvirt.driver [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Deletion of /var/lib/nova/instances/b00f20b4-40d9-4fe7-8782-20859f161134_del complete#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.540 2 INFO nova.compute.manager [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.541 2 DEBUG oslo.service.loopingcall [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.541 2 DEBUG nova.compute.manager [-] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.541 2 DEBUG nova.network.neutron [-] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  7 16:11:51 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-86e75a47dc64fcd70cb91696a35e3d2db616fb42a44a1fda1aa919bb6cc2c7d7-userdata-shm.mount: Deactivated successfully.
Oct  7 16:11:51 np0005474864 systemd[1]: var-lib-containers-storage-overlay-9909fdef655b4d84b97b756481f3c60130c0e5943d49ac4883d2e39fd83b72e7-merged.mount: Deactivated successfully.
Oct  7 16:11:51 np0005474864 podman[221631]: 2025-10-07 20:11:51.709326202 +0000 UTC m=+0.352281922 container cleanup 86e75a47dc64fcd70cb91696a35e3d2db616fb42a44a1fda1aa919bb6cc2c7d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:11:51 np0005474864 systemd[1]: libpod-conmon-86e75a47dc64fcd70cb91696a35e3d2db616fb42a44a1fda1aa919bb6cc2c7d7.scope: Deactivated successfully.
Oct  7 16:11:51 np0005474864 podman[221676]: 2025-10-07 20:11:51.821607901 +0000 UTC m=+0.068778769 container remove 86e75a47dc64fcd70cb91696a35e3d2db616fb42a44a1fda1aa919bb6cc2c7d7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  7 16:11:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:51.831 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[1b6d39db-3fc7-4023-83ac-7375431712d1]: (4, ('Tue Oct  7 08:11:51 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe (86e75a47dc64fcd70cb91696a35e3d2db616fb42a44a1fda1aa919bb6cc2c7d7)\n86e75a47dc64fcd70cb91696a35e3d2db616fb42a44a1fda1aa919bb6cc2c7d7\nTue Oct  7 08:11:51 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe (86e75a47dc64fcd70cb91696a35e3d2db616fb42a44a1fda1aa919bb6cc2c7d7)\n86e75a47dc64fcd70cb91696a35e3d2db616fb42a44a1fda1aa919bb6cc2c7d7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:51.834 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[b70b2a4f-110d-41d8-8303-0465e2955ea1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:51.836 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap116d80b3-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:11:51 np0005474864 kernel: tap116d80b3-90: left promiscuous mode
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.839 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:51 np0005474864 nova_compute[192593]: 2025-10-07 20:11:51.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:51.869 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ca5e4219-c5e3-4a35-b348-ba2a413f4322]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:51.906 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[df83185c-f318-4082-baf5-8771faaff92d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:51.907 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[165e4200-8563-47f2-81c2-8990b5dafc34]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:51.922 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c3c776ae-f479-4111-b03b-056c8ad8b4ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 352135, 'reachable_time': 21695, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221691, 'error': None, 'target': 'ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:51.927 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-116d80b3-97b2-4699-8d30-7d57fd9728fe deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:11:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:51.927 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[e277df57-02ef-422f-8c4a-035b0e964c5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:11:51 np0005474864 systemd[1]: run-netns-ovnmeta\x2d116d80b3\x2d97b2\x2d4699\x2d8d30\x2d7d57fd9728fe.mount: Deactivated successfully.
Oct  7 16:11:52 np0005474864 nova_compute[192593]: 2025-10-07 20:11:52.557 2 DEBUG nova.compute.manager [req-4303fd0f-d23a-49f4-820f-1b0b4238d7bb req-4ee3514f-a1f8-467b-9946-c15093e658cf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Received event network-vif-unplugged-18c97918-cded-43d4-9ccc-3569cc948551 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:11:52 np0005474864 nova_compute[192593]: 2025-10-07 20:11:52.558 2 DEBUG oslo_concurrency.lockutils [req-4303fd0f-d23a-49f4-820f-1b0b4238d7bb req-4ee3514f-a1f8-467b-9946-c15093e658cf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "b00f20b4-40d9-4fe7-8782-20859f161134-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:52 np0005474864 nova_compute[192593]: 2025-10-07 20:11:52.559 2 DEBUG oslo_concurrency.lockutils [req-4303fd0f-d23a-49f4-820f-1b0b4238d7bb req-4ee3514f-a1f8-467b-9946-c15093e658cf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "b00f20b4-40d9-4fe7-8782-20859f161134-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:52 np0005474864 nova_compute[192593]: 2025-10-07 20:11:52.559 2 DEBUG oslo_concurrency.lockutils [req-4303fd0f-d23a-49f4-820f-1b0b4238d7bb req-4ee3514f-a1f8-467b-9946-c15093e658cf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "b00f20b4-40d9-4fe7-8782-20859f161134-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:52 np0005474864 nova_compute[192593]: 2025-10-07 20:11:52.559 2 DEBUG nova.compute.manager [req-4303fd0f-d23a-49f4-820f-1b0b4238d7bb req-4ee3514f-a1f8-467b-9946-c15093e658cf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] No waiting events found dispatching network-vif-unplugged-18c97918-cded-43d4-9ccc-3569cc948551 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:11:52 np0005474864 nova_compute[192593]: 2025-10-07 20:11:52.560 2 DEBUG nova.compute.manager [req-4303fd0f-d23a-49f4-820f-1b0b4238d7bb req-4ee3514f-a1f8-467b-9946-c15093e658cf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Received event network-vif-unplugged-18c97918-cded-43d4-9ccc-3569cc948551 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  7 16:11:52 np0005474864 nova_compute[192593]: 2025-10-07 20:11:52.560 2 DEBUG nova.compute.manager [req-4303fd0f-d23a-49f4-820f-1b0b4238d7bb req-4ee3514f-a1f8-467b-9946-c15093e658cf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Received event network-vif-plugged-18c97918-cded-43d4-9ccc-3569cc948551 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:11:52 np0005474864 nova_compute[192593]: 2025-10-07 20:11:52.560 2 DEBUG oslo_concurrency.lockutils [req-4303fd0f-d23a-49f4-820f-1b0b4238d7bb req-4ee3514f-a1f8-467b-9946-c15093e658cf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "b00f20b4-40d9-4fe7-8782-20859f161134-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:52 np0005474864 nova_compute[192593]: 2025-10-07 20:11:52.561 2 DEBUG oslo_concurrency.lockutils [req-4303fd0f-d23a-49f4-820f-1b0b4238d7bb req-4ee3514f-a1f8-467b-9946-c15093e658cf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "b00f20b4-40d9-4fe7-8782-20859f161134-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:52 np0005474864 nova_compute[192593]: 2025-10-07 20:11:52.561 2 DEBUG oslo_concurrency.lockutils [req-4303fd0f-d23a-49f4-820f-1b0b4238d7bb req-4ee3514f-a1f8-467b-9946-c15093e658cf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "b00f20b4-40d9-4fe7-8782-20859f161134-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:52 np0005474864 nova_compute[192593]: 2025-10-07 20:11:52.561 2 DEBUG nova.compute.manager [req-4303fd0f-d23a-49f4-820f-1b0b4238d7bb req-4ee3514f-a1f8-467b-9946-c15093e658cf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] No waiting events found dispatching network-vif-plugged-18c97918-cded-43d4-9ccc-3569cc948551 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:11:52 np0005474864 nova_compute[192593]: 2025-10-07 20:11:52.562 2 WARNING nova.compute.manager [req-4303fd0f-d23a-49f4-820f-1b0b4238d7bb req-4ee3514f-a1f8-467b-9946-c15093e658cf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Received unexpected event network-vif-plugged-18c97918-cded-43d4-9ccc-3569cc948551 for instance with vm_state active and task_state deleting.#033[00m
Oct  7 16:11:52 np0005474864 nova_compute[192593]: 2025-10-07 20:11:52.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:54 np0005474864 nova_compute[192593]: 2025-10-07 20:11:54.868 2 DEBUG nova.network.neutron [-] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:11:54 np0005474864 nova_compute[192593]: 2025-10-07 20:11:54.894 2 INFO nova.compute.manager [-] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Took 3.35 seconds to deallocate network for instance.#033[00m
Oct  7 16:11:54 np0005474864 nova_compute[192593]: 2025-10-07 20:11:54.993 2 DEBUG oslo_concurrency.lockutils [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:11:54 np0005474864 nova_compute[192593]: 2025-10-07 20:11:54.994 2 DEBUG oslo_concurrency.lockutils [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:11:55 np0005474864 nova_compute[192593]: 2025-10-07 20:11:55.105 2 DEBUG nova.compute.provider_tree [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:11:55 np0005474864 nova_compute[192593]: 2025-10-07 20:11:55.124 2 DEBUG nova.compute.manager [req-f4793377-c734-4d63-83be-84b471a2bb8f req-f619d623-46ac-4630-a39d-11a2b7eba5af 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Received event network-vif-deleted-18c97918-cded-43d4-9ccc-3569cc948551 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:11:55 np0005474864 nova_compute[192593]: 2025-10-07 20:11:55.127 2 DEBUG nova.scheduler.client.report [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:11:55 np0005474864 nova_compute[192593]: 2025-10-07 20:11:55.163 2 DEBUG oslo_concurrency.lockutils [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:55 np0005474864 nova_compute[192593]: 2025-10-07 20:11:55.193 2 INFO nova.scheduler.client.report [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Deleted allocations for instance b00f20b4-40d9-4fe7-8782-20859f161134#033[00m
Oct  7 16:11:55 np0005474864 nova_compute[192593]: 2025-10-07 20:11:55.287 2 DEBUG oslo_concurrency.lockutils [None req-2c5e628d-e404-4785-b391-ef0647075c4e fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "b00f20b4-40d9-4fe7-8782-20859f161134" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:11:55 np0005474864 podman[221703]: 2025-10-07 20:11:55.389723363 +0000 UTC m=+0.071173488 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:11:55 np0005474864 podman[221704]: 2025-10-07 20:11:55.414489175 +0000 UTC m=+0.089136315 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, architecture=x86_64, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, release=1755695350)
Oct  7 16:11:56 np0005474864 nova_compute[192593]: 2025-10-07 20:11:56.488 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:11:57.340 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:11:57 np0005474864 nova_compute[192593]: 2025-10-07 20:11:57.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:59 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:59Z|00060|binding|INFO|Releasing lport c5ef72cf-6df8-4a8a-8a49-e0344354f772 from this chassis (sb_readonly=0)
Oct  7 16:11:59 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:59Z|00061|binding|INFO|Releasing lport 7ace7da2-42dc-433a-8b4d-8286301cfa0e from this chassis (sb_readonly=0)
Oct  7 16:11:59 np0005474864 nova_compute[192593]: 2025-10-07 20:11:59.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:11:59 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:59Z|00062|binding|INFO|Releasing lport c5ef72cf-6df8-4a8a-8a49-e0344354f772 from this chassis (sb_readonly=0)
Oct  7 16:11:59 np0005474864 ovn_controller[94801]: 2025-10-07T20:11:59Z|00063|binding|INFO|Releasing lport 7ace7da2-42dc-433a-8b4d-8286301cfa0e from this chassis (sb_readonly=0)
Oct  7 16:11:59 np0005474864 nova_compute[192593]: 2025-10-07 20:11:59.718 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:00 np0005474864 podman[221745]: 2025-10-07 20:12:00.40702806 +0000 UTC m=+0.092156752 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=iscsid, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  7 16:12:00 np0005474864 podman[221747]: 2025-10-07 20:12:00.430589927 +0000 UTC m=+0.105049102 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:12:00 np0005474864 podman[221746]: 2025-10-07 20:12:00.466294154 +0000 UTC m=+0.153877476 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:12:00 np0005474864 nova_compute[192593]: 2025-10-07 20:12:00.973 2 DEBUG oslo_concurrency.lockutils [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:00 np0005474864 nova_compute[192593]: 2025-10-07 20:12:00.974 2 DEBUG oslo_concurrency.lockutils [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:00 np0005474864 nova_compute[192593]: 2025-10-07 20:12:00.974 2 DEBUG oslo_concurrency.lockutils [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:00 np0005474864 nova_compute[192593]: 2025-10-07 20:12:00.975 2 DEBUG oslo_concurrency.lockutils [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:00 np0005474864 nova_compute[192593]: 2025-10-07 20:12:00.975 2 DEBUG oslo_concurrency.lockutils [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:00 np0005474864 nova_compute[192593]: 2025-10-07 20:12:00.978 2 INFO nova.compute.manager [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Terminating instance#033[00m
Oct  7 16:12:00 np0005474864 nova_compute[192593]: 2025-10-07 20:12:00.980 2 DEBUG nova.compute.manager [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  7 16:12:01 np0005474864 kernel: tap6ac9cc3d-50 (unregistering): left promiscuous mode
Oct  7 16:12:01 np0005474864 NetworkManager[51631]: <info>  [1759867921.0307] device (tap6ac9cc3d-50): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:01Z|00064|binding|INFO|Releasing lport 6ac9cc3d-5039-41fa-a966-ec61d9e9c38b from this chassis (sb_readonly=0)
Oct  7 16:12:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:01Z|00065|binding|INFO|Setting lport 6ac9cc3d-5039-41fa-a966-ec61d9e9c38b down in Southbound
Oct  7 16:12:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:01Z|00066|binding|INFO|Removing iface tap6ac9cc3d-50 ovn-installed in OVS
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:01.056 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5d:0b:e9 10.100.0.11'], port_security=['fa:16:3e:5d:0b:e9 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '3aa55e8a-0c2d-4f7b-aac0-c393e35ec679', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a9053617-1148-4139-a949-8321e760481f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '57491b24c6b2419c842483a87c8b4d42', 'neutron:revision_number': '4', 'neutron:security_group_ids': '827ff091-5676-4b1d-8d1c-d3af6f7c6fff', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a890b16f-fa51-4e24-8f2d-bf0ff593911f, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=6ac9cc3d-5039-41fa-a966-ec61d9e9c38b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:12:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:01.058 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 6ac9cc3d-5039-41fa-a966-ec61d9e9c38b in datapath a9053617-1148-4139-a949-8321e760481f unbound from our chassis#033[00m
Oct  7 16:12:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:01.062 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a9053617-1148-4139-a949-8321e760481f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:12:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:01.064 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ab3ba715-a983-4b98-bc04-e9cd88208816]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:01.065 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a9053617-1148-4139-a949-8321e760481f namespace which is not needed anymore#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:01 np0005474864 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Oct  7 16:12:01 np0005474864 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 16.414s CPU time.
Oct  7 16:12:01 np0005474864 systemd-machined[152586]: Machine qemu-2-instance-00000002 terminated.
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.168 2 DEBUG nova.compute.manager [req-3ab99b88-588a-4bd9-83d1-e95821e227b9 req-6c5439f9-f3a6-4bac-9fa3-e3da5a363657 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Received event network-changed-2f616ccf-0f40-4768-8b90-07dae3707b82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.170 2 DEBUG nova.compute.manager [req-3ab99b88-588a-4bd9-83d1-e95821e227b9 req-6c5439f9-f3a6-4bac-9fa3-e3da5a363657 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Refreshing instance network info cache due to event network-changed-2f616ccf-0f40-4768-8b90-07dae3707b82. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.171 2 DEBUG oslo_concurrency.lockutils [req-3ab99b88-588a-4bd9-83d1-e95821e227b9 req-6c5439f9-f3a6-4bac-9fa3-e3da5a363657 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-905ba276-3439-4ffc-9fa7-b8ce71d79b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.171 2 DEBUG oslo_concurrency.lockutils [req-3ab99b88-588a-4bd9-83d1-e95821e227b9 req-6c5439f9-f3a6-4bac-9fa3-e3da5a363657 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-905ba276-3439-4ffc-9fa7-b8ce71d79b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.171 2 DEBUG nova.network.neutron [req-3ab99b88-588a-4bd9-83d1-e95821e227b9 req-6c5439f9-f3a6-4bac-9fa3-e3da5a363657 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Refreshing network info cache for port 2f616ccf-0f40-4768-8b90-07dae3707b82 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.259 2 INFO nova.virt.libvirt.driver [-] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Instance destroyed successfully.#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.260 2 DEBUG nova.objects.instance [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lazy-loading 'resources' on Instance uuid 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.275 2 DEBUG nova.virt.libvirt.vif [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:10:03Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1247597446',display_name='tempest-TestNetworkBasicOps-server-1247597446',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1247597446',id=2,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEYMbYS6FykH8XwBy+xJhBj+DroVNsweQH8/yrZNy2DdnGZ8U7ITpQhcHiv47cPc+C9zUx61bpZMf9xi1jJuzMTLouiDNZyx+sOCB1Md+ZKzM9kBZzk7412n9ZJH/ZBT6Q==',key_name='tempest-TestNetworkBasicOps-1328202832',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:10:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-9t6cko4x',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:10:40Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=3aa55e8a-0c2d-4f7b-aac0-c393e35ec679,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "address": "fa:16:3e:5d:0b:e9", "network": {"id": "a9053617-1148-4139-a949-8321e760481f", "bridge": "br-int", "label": "tempest-network-smoke--1784539601", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac9cc3d-50", "ovs_interfaceid": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.276 2 DEBUG nova.network.os_vif_util [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converting VIF {"id": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "address": "fa:16:3e:5d:0b:e9", "network": {"id": "a9053617-1148-4139-a949-8321e760481f", "bridge": "br-int", "label": "tempest-network-smoke--1784539601", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac9cc3d-50", "ovs_interfaceid": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.277 2 DEBUG nova.network.os_vif_util [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5d:0b:e9,bridge_name='br-int',has_traffic_filtering=True,id=6ac9cc3d-5039-41fa-a966-ec61d9e9c38b,network=Network(a9053617-1148-4139-a949-8321e760481f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ac9cc3d-50') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.277 2 DEBUG os_vif [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5d:0b:e9,bridge_name='br-int',has_traffic_filtering=True,id=6ac9cc3d-5039-41fa-a966-ec61d9e9c38b,network=Network(a9053617-1148-4139-a949-8321e760481f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ac9cc3d-50') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.279 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.280 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ac9cc3d-50, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.284 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.288 2 INFO os_vif [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5d:0b:e9,bridge_name='br-int',has_traffic_filtering=True,id=6ac9cc3d-5039-41fa-a966-ec61d9e9c38b,network=Network(a9053617-1148-4139-a949-8321e760481f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ac9cc3d-50')#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.288 2 INFO nova.virt.libvirt.driver [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Deleting instance files /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679_del#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.289 2 INFO nova.virt.libvirt.driver [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Deletion of /var/lib/nova/instances/3aa55e8a-0c2d-4f7b-aac0-c393e35ec679_del complete#033[00m
Oct  7 16:12:01 np0005474864 neutron-haproxy-ovnmeta-a9053617-1148-4139-a949-8321e760481f[220620]: [NOTICE]   (220624) : haproxy version is 2.8.14-c23fe91
Oct  7 16:12:01 np0005474864 neutron-haproxy-ovnmeta-a9053617-1148-4139-a949-8321e760481f[220620]: [NOTICE]   (220624) : path to executable is /usr/sbin/haproxy
Oct  7 16:12:01 np0005474864 neutron-haproxy-ovnmeta-a9053617-1148-4139-a949-8321e760481f[220620]: [WARNING]  (220624) : Exiting Master process...
Oct  7 16:12:01 np0005474864 neutron-haproxy-ovnmeta-a9053617-1148-4139-a949-8321e760481f[220620]: [ALERT]    (220624) : Current worker (220626) exited with code 143 (Terminated)
Oct  7 16:12:01 np0005474864 neutron-haproxy-ovnmeta-a9053617-1148-4139-a949-8321e760481f[220620]: [WARNING]  (220624) : All workers exited. Exiting... (0)
Oct  7 16:12:01 np0005474864 systemd[1]: libpod-7c81144c8b8c0111e73e7834cb0feef96b4106ce0381f967bf0cb1518e112efa.scope: Deactivated successfully.
Oct  7 16:12:01 np0005474864 podman[221831]: 2025-10-07 20:12:01.320712256 +0000 UTC m=+0.125206942 container died 7c81144c8b8c0111e73e7834cb0feef96b4106ce0381f967bf0cb1518e112efa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a9053617-1148-4139-a949-8321e760481f, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.325 2 DEBUG oslo_concurrency.lockutils [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Acquiring lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.325 2 DEBUG oslo_concurrency.lockutils [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.326 2 DEBUG oslo_concurrency.lockutils [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Acquiring lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.327 2 DEBUG oslo_concurrency.lockutils [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.327 2 DEBUG oslo_concurrency.lockutils [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.329 2 INFO nova.compute.manager [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Terminating instance#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.331 2 DEBUG nova.compute.manager [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.352 2 INFO nova.compute.manager [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.352 2 DEBUG oslo.service.loopingcall [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.353 2 DEBUG nova.compute.manager [-] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.353 2 DEBUG nova.network.neutron [-] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  7 16:12:01 np0005474864 kernel: tap2f616ccf-0f (unregistering): left promiscuous mode
Oct  7 16:12:01 np0005474864 NetworkManager[51631]: <info>  [1759867921.3674] device (tap2f616ccf-0f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.441 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:01Z|00067|binding|INFO|Releasing lport 2f616ccf-0f40-4768-8b90-07dae3707b82 from this chassis (sb_readonly=0)
Oct  7 16:12:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:01Z|00068|binding|INFO|Setting lport 2f616ccf-0f40-4768-8b90-07dae3707b82 down in Southbound
Oct  7 16:12:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:01Z|00069|binding|INFO|Removing iface tap2f616ccf-0f ovn-installed in OVS
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:01.452 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d6:ac:0d 10.100.0.8'], port_security=['fa:16:3e:d6:ac:0d 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '905ba276-3439-4ffc-9fa7-b8ce71d79b96', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-872b0f73-e6f1-41ef-b96e-e40b61240904', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'eb6dd99c8537434e826f505f7c17fb9d', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2ad047e3-0557-4837-bfbc-0bc867d5bdab abd2f5b7-715d-4885-a879-b191e588ae09', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a52d61e3-7a73-40b5-91b2-011ed52540be, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=2f616ccf-0f40-4768-8b90-07dae3707b82) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:01 np0005474864 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Oct  7 16:12:01 np0005474864 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000c.scope: Consumed 13.832s CPU time.
Oct  7 16:12:01 np0005474864 systemd-machined[152586]: Machine qemu-4-instance-0000000c terminated.
Oct  7 16:12:01 np0005474864 NetworkManager[51631]: <info>  [1759867921.5606] manager: (tap2f616ccf-0f): new Tun device (/org/freedesktop/NetworkManager/Devices/43)
Oct  7 16:12:01 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7c81144c8b8c0111e73e7834cb0feef96b4106ce0381f967bf0cb1518e112efa-userdata-shm.mount: Deactivated successfully.
Oct  7 16:12:01 np0005474864 systemd[1]: var-lib-containers-storage-overlay-0915cbe3cc260c42bf96e2db68778e8e1d546d3b4e68292d6f4238a617767750-merged.mount: Deactivated successfully.
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.620 2 INFO nova.virt.libvirt.driver [-] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Instance destroyed successfully.#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.621 2 DEBUG nova.objects.instance [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lazy-loading 'resources' on Instance uuid 905ba276-3439-4ffc-9fa7-b8ce71d79b96 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.634 2 DEBUG nova.virt.libvirt.vif [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:11:17Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1096783716-access_point-474518308',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1096783716-access_point-474518308',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1096783716-ac',id=12,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGDkqxmMN+Aruikf/t9ZJtlDoMcoyfxz6sKwIlxv989FdnDhDVp3Dd9igEHauIxEyymp1djFrT/aIF60PZHGcUp63X92MHlpEGvJzVqfsnrqYyF8hZ+ZfmxbiJh7Y+ZIOQ==',key_name='tempest-TestSecurityGroupsBasicOps-1494070395',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:11:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='eb6dd99c8537434e826f505f7c17fb9d',ramdisk_id='',reservation_id='r-8x4964bw',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1096783716',owner_user_name='tempest-TestSecurityGroupsBasicOps-1096783716-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:11:32Z,user_data=None,user_id='c01210eeec574b6e98f04c03e858c140',uuid=905ba276-3439-4ffc-9fa7-b8ce71d79b96,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "2f616ccf-0f40-4768-8b90-07dae3707b82", "address": "fa:16:3e:d6:ac:0d", "network": {"id": "872b0f73-e6f1-41ef-b96e-e40b61240904", "bridge": "br-int", "label": "tempest-network-smoke--469697660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb6dd99c8537434e826f505f7c17fb9d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f616ccf-0f", "ovs_interfaceid": "2f616ccf-0f40-4768-8b90-07dae3707b82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.635 2 DEBUG nova.network.os_vif_util [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Converting VIF {"id": "2f616ccf-0f40-4768-8b90-07dae3707b82", "address": "fa:16:3e:d6:ac:0d", "network": {"id": "872b0f73-e6f1-41ef-b96e-e40b61240904", "bridge": "br-int", "label": "tempest-network-smoke--469697660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.208", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb6dd99c8537434e826f505f7c17fb9d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f616ccf-0f", "ovs_interfaceid": "2f616ccf-0f40-4768-8b90-07dae3707b82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.635 2 DEBUG nova.network.os_vif_util [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d6:ac:0d,bridge_name='br-int',has_traffic_filtering=True,id=2f616ccf-0f40-4768-8b90-07dae3707b82,network=Network(872b0f73-e6f1-41ef-b96e-e40b61240904),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f616ccf-0f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.636 2 DEBUG os_vif [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d6:ac:0d,bridge_name='br-int',has_traffic_filtering=True,id=2f616ccf-0f40-4768-8b90-07dae3707b82,network=Network(872b0f73-e6f1-41ef-b96e-e40b61240904),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f616ccf-0f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.637 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap2f616ccf-0f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.643 2 INFO os_vif [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d6:ac:0d,bridge_name='br-int',has_traffic_filtering=True,id=2f616ccf-0f40-4768-8b90-07dae3707b82,network=Network(872b0f73-e6f1-41ef-b96e-e40b61240904),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap2f616ccf-0f')#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.643 2 INFO nova.virt.libvirt.driver [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Deleting instance files /var/lib/nova/instances/905ba276-3439-4ffc-9fa7-b8ce71d79b96_del#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.644 2 INFO nova.virt.libvirt.driver [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Deletion of /var/lib/nova/instances/905ba276-3439-4ffc-9fa7-b8ce71d79b96_del complete#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.692 2 INFO nova.compute.manager [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.693 2 DEBUG oslo.service.loopingcall [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.693 2 DEBUG nova.compute.manager [-] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  7 16:12:01 np0005474864 nova_compute[192593]: 2025-10-07 20:12:01.693 2 DEBUG nova.network.neutron [-] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  7 16:12:01 np0005474864 podman[221831]: 2025-10-07 20:12:01.872959158 +0000 UTC m=+0.677453814 container cleanup 7c81144c8b8c0111e73e7834cb0feef96b4106ce0381f967bf0cb1518e112efa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a9053617-1148-4139-a949-8321e760481f, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:12:01 np0005474864 systemd[1]: libpod-conmon-7c81144c8b8c0111e73e7834cb0feef96b4106ce0381f967bf0cb1518e112efa.scope: Deactivated successfully.
Oct  7 16:12:02 np0005474864 podman[221896]: 2025-10-07 20:12:02.166693615 +0000 UTC m=+0.256738604 container remove 7c81144c8b8c0111e73e7834cb0feef96b4106ce0381f967bf0cb1518e112efa (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a9053617-1148-4139-a949-8321e760481f, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  7 16:12:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:02.177 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[6f576c67-1662-4ac2-8c22-8814186cdd21]: (4, ('Tue Oct  7 08:12:01 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a9053617-1148-4139-a949-8321e760481f (7c81144c8b8c0111e73e7834cb0feef96b4106ce0381f967bf0cb1518e112efa)\n7c81144c8b8c0111e73e7834cb0feef96b4106ce0381f967bf0cb1518e112efa\nTue Oct  7 08:12:01 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a9053617-1148-4139-a949-8321e760481f (7c81144c8b8c0111e73e7834cb0feef96b4106ce0381f967bf0cb1518e112efa)\n7c81144c8b8c0111e73e7834cb0feef96b4106ce0381f967bf0cb1518e112efa\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:02.181 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c74ed5c6-747e-4197-820a-4a407b7f986d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:02.182 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa9053617-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:02 np0005474864 kernel: tapa9053617-10: left promiscuous mode
Oct  7 16:12:02 np0005474864 nova_compute[192593]: 2025-10-07 20:12:02.185 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:02 np0005474864 nova_compute[192593]: 2025-10-07 20:12:02.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:02.209 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[627dda60-6f9c-4e3a-a192-e5ccd1a86327]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:02.235 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[3fae81de-868a-4c78-8176-314b1e193b6c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:02.237 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[08930365-efe4-40aa-ab11-3e5cb816c83f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:02.264 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[75a53a33-3481-40a8-b400-62849c6910d2]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 345503, 'reachable_time': 37811, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221911, 'error': None, 'target': 'ovnmeta-a9053617-1148-4139-a949-8321e760481f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:02.267 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a9053617-1148-4139-a949-8321e760481f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:12:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:02.267 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[570f8b65-c0aa-4818-802c-8c49e04d309d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:02.268 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 2f616ccf-0f40-4768-8b90-07dae3707b82 in datapath 872b0f73-e6f1-41ef-b96e-e40b61240904 unbound from our chassis#033[00m
Oct  7 16:12:02 np0005474864 systemd[1]: run-netns-ovnmeta\x2da9053617\x2d1148\x2d4139\x2da949\x2d8321e760481f.mount: Deactivated successfully.
Oct  7 16:12:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:02.270 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 872b0f73-e6f1-41ef-b96e-e40b61240904, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:12:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:02.271 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ade74611-2fcf-46d1-bb7e-9db21023b777]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:02.272 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904 namespace which is not needed anymore#033[00m
Oct  7 16:12:02 np0005474864 nova_compute[192593]: 2025-10-07 20:12:02.324 2 DEBUG nova.compute.manager [req-4972f7e0-2877-4498-9596-6f9dc23bf0f2 req-6128dd7e-3c7f-45d7-9c46-dc0e05f69afb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Received event network-changed-6ac9cc3d-5039-41fa-a966-ec61d9e9c38b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:12:02 np0005474864 nova_compute[192593]: 2025-10-07 20:12:02.325 2 DEBUG nova.compute.manager [req-4972f7e0-2877-4498-9596-6f9dc23bf0f2 req-6128dd7e-3c7f-45d7-9c46-dc0e05f69afb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Refreshing instance network info cache due to event network-changed-6ac9cc3d-5039-41fa-a966-ec61d9e9c38b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:12:02 np0005474864 nova_compute[192593]: 2025-10-07 20:12:02.325 2 DEBUG oslo_concurrency.lockutils [req-4972f7e0-2877-4498-9596-6f9dc23bf0f2 req-6128dd7e-3c7f-45d7-9c46-dc0e05f69afb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:12:02 np0005474864 nova_compute[192593]: 2025-10-07 20:12:02.326 2 DEBUG oslo_concurrency.lockutils [req-4972f7e0-2877-4498-9596-6f9dc23bf0f2 req-6128dd7e-3c7f-45d7-9c46-dc0e05f69afb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:12:02 np0005474864 nova_compute[192593]: 2025-10-07 20:12:02.326 2 DEBUG nova.network.neutron [req-4972f7e0-2877-4498-9596-6f9dc23bf0f2 req-6128dd7e-3c7f-45d7-9c46-dc0e05f69afb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Refreshing network info cache for port 6ac9cc3d-5039-41fa-a966-ec61d9e9c38b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:12:02 np0005474864 neutron-haproxy-ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904[221493]: [NOTICE]   (221506) : haproxy version is 2.8.14-c23fe91
Oct  7 16:12:02 np0005474864 neutron-haproxy-ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904[221493]: [NOTICE]   (221506) : path to executable is /usr/sbin/haproxy
Oct  7 16:12:02 np0005474864 neutron-haproxy-ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904[221493]: [WARNING]  (221506) : Exiting Master process...
Oct  7 16:12:02 np0005474864 neutron-haproxy-ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904[221493]: [ALERT]    (221506) : Current worker (221508) exited with code 143 (Terminated)
Oct  7 16:12:02 np0005474864 neutron-haproxy-ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904[221493]: [WARNING]  (221506) : All workers exited. Exiting... (0)
Oct  7 16:12:02 np0005474864 systemd[1]: libpod-2f525620a1c0a06006bf38c8fb41a8f515ffc5ce4a99e089e702050afe62fa5a.scope: Deactivated successfully.
Oct  7 16:12:02 np0005474864 podman[221929]: 2025-10-07 20:12:02.478346708 +0000 UTC m=+0.078042856 container died 2f525620a1c0a06006bf38c8fb41a8f515ffc5ce4a99e089e702050afe62fa5a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:12:02 np0005474864 nova_compute[192593]: 2025-10-07 20:12:02.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:02 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2f525620a1c0a06006bf38c8fb41a8f515ffc5ce4a99e089e702050afe62fa5a-userdata-shm.mount: Deactivated successfully.
Oct  7 16:12:02 np0005474864 systemd[1]: var-lib-containers-storage-overlay-02cad8079826475d1d2a5ea6e0c5fdb8c3156821e255025ea27ca056e5437dc2-merged.mount: Deactivated successfully.
Oct  7 16:12:02 np0005474864 podman[221929]: 2025-10-07 20:12:02.997021134 +0000 UTC m=+0.596717292 container cleanup 2f525620a1c0a06006bf38c8fb41a8f515ffc5ce4a99e089e702050afe62fa5a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:12:03 np0005474864 systemd[1]: libpod-conmon-2f525620a1c0a06006bf38c8fb41a8f515ffc5ce4a99e089e702050afe62fa5a.scope: Deactivated successfully.
Oct  7 16:12:03 np0005474864 podman[221960]: 2025-10-07 20:12:03.172853211 +0000 UTC m=+0.137720632 container remove 2f525620a1c0a06006bf38c8fb41a8f515ffc5ce4a99e089e702050afe62fa5a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  7 16:12:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:03.182 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[eb55b06c-697e-4d2a-bcdd-c1a157ae2350]: (4, ('Tue Oct  7 08:12:02 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904 (2f525620a1c0a06006bf38c8fb41a8f515ffc5ce4a99e089e702050afe62fa5a)\n2f525620a1c0a06006bf38c8fb41a8f515ffc5ce4a99e089e702050afe62fa5a\nTue Oct  7 08:12:03 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904 (2f525620a1c0a06006bf38c8fb41a8f515ffc5ce4a99e089e702050afe62fa5a)\n2f525620a1c0a06006bf38c8fb41a8f515ffc5ce4a99e089e702050afe62fa5a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:03.184 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[e1c41875-f2f9-48a3-a079-23cf1e43f400]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:03.185 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap872b0f73-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:03 np0005474864 nova_compute[192593]: 2025-10-07 20:12:03.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:03 np0005474864 kernel: tap872b0f73-e0: left promiscuous mode
Oct  7 16:12:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:03.192 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d512edeb-830b-486a-92cf-db65d8c56dd1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:03 np0005474864 nova_compute[192593]: 2025-10-07 20:12:03.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:03.231 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[f4031fe9-0ac3-4ab6-a882-4a2d1eae91e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:03.233 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[f6547c63-9f95-4012-a2e2-caa4801781d6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:03.257 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[81f617bd-b8d9-4c37-828f-c16f5aac1c63]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 352441, 'reachable_time': 22938, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221976, 'error': None, 'target': 'ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:03 np0005474864 systemd[1]: run-netns-ovnmeta\x2d872b0f73\x2de6f1\x2d41ef\x2db96e\x2de40b61240904.mount: Deactivated successfully.
Oct  7 16:12:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:03.260 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-872b0f73-e6f1-41ef-b96e-e40b61240904 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:12:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:03.260 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[02f85248-8275-454b-82d5-73f6e3da4ccf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:03 np0005474864 nova_compute[192593]: 2025-10-07 20:12:03.835 2 DEBUG nova.network.neutron [-] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:12:03 np0005474864 nova_compute[192593]: 2025-10-07 20:12:03.858 2 INFO nova.compute.manager [-] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Took 2.50 seconds to deallocate network for instance.#033[00m
Oct  7 16:12:03 np0005474864 nova_compute[192593]: 2025-10-07 20:12:03.904 2 DEBUG oslo_concurrency.lockutils [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:03 np0005474864 nova_compute[192593]: 2025-10-07 20:12:03.905 2 DEBUG oslo_concurrency.lockutils [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.008 2 DEBUG nova.compute.provider_tree [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.039 2 DEBUG nova.scheduler.client.report [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.082 2 DEBUG oslo_concurrency.lockutils [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.123 2 INFO nova.scheduler.client.report [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Deleted allocations for instance 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.129 2 DEBUG nova.network.neutron [-] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.150 2 INFO nova.compute.manager [-] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Took 2.46 seconds to deallocate network for instance.#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.212 2 DEBUG oslo_concurrency.lockutils [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.213 2 DEBUG oslo_concurrency.lockutils [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.215 2 DEBUG oslo_concurrency.lockutils [None req-38ab8746-719c-410c-8089-ce532bed55f1 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.261 2 DEBUG nova.compute.provider_tree [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.277 2 DEBUG nova.scheduler.client.report [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.298 2 DEBUG oslo_concurrency.lockutils [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.085s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.328 2 INFO nova.scheduler.client.report [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Deleted allocations for instance 905ba276-3439-4ffc-9fa7-b8ce71d79b96#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.348 2 DEBUG nova.network.neutron [req-3ab99b88-588a-4bd9-83d1-e95821e227b9 req-6c5439f9-f3a6-4bac-9fa3-e3da5a363657 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Updated VIF entry in instance network info cache for port 2f616ccf-0f40-4768-8b90-07dae3707b82. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.349 2 DEBUG nova.network.neutron [req-3ab99b88-588a-4bd9-83d1-e95821e227b9 req-6c5439f9-f3a6-4bac-9fa3-e3da5a363657 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Updating instance_info_cache with network_info: [{"id": "2f616ccf-0f40-4768-8b90-07dae3707b82", "address": "fa:16:3e:d6:ac:0d", "network": {"id": "872b0f73-e6f1-41ef-b96e-e40b61240904", "bridge": "br-int", "label": "tempest-network-smoke--469697660", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "eb6dd99c8537434e826f505f7c17fb9d", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap2f616ccf-0f", "ovs_interfaceid": "2f616ccf-0f40-4768-8b90-07dae3707b82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.395 2 DEBUG oslo_concurrency.lockutils [None req-5ae9ccfb-2b4b-4222-9361-5537dec22e03 c01210eeec574b6e98f04c03e858c140 eb6dd99c8537434e826f505f7c17fb9d - - default default] Lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.070s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.397 2 DEBUG oslo_concurrency.lockutils [req-3ab99b88-588a-4bd9-83d1-e95821e227b9 req-6c5439f9-f3a6-4bac-9fa3-e3da5a363657 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-905ba276-3439-4ffc-9fa7-b8ce71d79b96" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.604 2 DEBUG nova.compute.manager [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Received event network-vif-unplugged-6ac9cc3d-5039-41fa-a966-ec61d9e9c38b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.604 2 DEBUG oslo_concurrency.lockutils [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.605 2 DEBUG oslo_concurrency.lockutils [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.605 2 DEBUG oslo_concurrency.lockutils [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.605 2 DEBUG nova.compute.manager [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] No waiting events found dispatching network-vif-unplugged-6ac9cc3d-5039-41fa-a966-ec61d9e9c38b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.605 2 WARNING nova.compute.manager [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Received unexpected event network-vif-unplugged-6ac9cc3d-5039-41fa-a966-ec61d9e9c38b for instance with vm_state deleted and task_state None.#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.606 2 DEBUG nova.compute.manager [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Received event network-vif-plugged-6ac9cc3d-5039-41fa-a966-ec61d9e9c38b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.606 2 DEBUG oslo_concurrency.lockutils [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.606 2 DEBUG oslo_concurrency.lockutils [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.607 2 DEBUG oslo_concurrency.lockutils [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "3aa55e8a-0c2d-4f7b-aac0-c393e35ec679-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.607 2 DEBUG nova.compute.manager [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] No waiting events found dispatching network-vif-plugged-6ac9cc3d-5039-41fa-a966-ec61d9e9c38b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.607 2 WARNING nova.compute.manager [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Received unexpected event network-vif-plugged-6ac9cc3d-5039-41fa-a966-ec61d9e9c38b for instance with vm_state deleted and task_state None.#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.607 2 DEBUG nova.compute.manager [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Received event network-vif-unplugged-2f616ccf-0f40-4768-8b90-07dae3707b82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.608 2 DEBUG oslo_concurrency.lockutils [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.608 2 DEBUG oslo_concurrency.lockutils [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.608 2 DEBUG oslo_concurrency.lockutils [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.609 2 DEBUG nova.compute.manager [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] No waiting events found dispatching network-vif-unplugged-2f616ccf-0f40-4768-8b90-07dae3707b82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.609 2 WARNING nova.compute.manager [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Received unexpected event network-vif-unplugged-2f616ccf-0f40-4768-8b90-07dae3707b82 for instance with vm_state deleted and task_state None.#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.609 2 DEBUG nova.compute.manager [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Received event network-vif-deleted-6ac9cc3d-5039-41fa-a966-ec61d9e9c38b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.610 2 DEBUG nova.compute.manager [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Received event network-vif-plugged-2f616ccf-0f40-4768-8b90-07dae3707b82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.610 2 DEBUG oslo_concurrency.lockutils [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.610 2 DEBUG oslo_concurrency.lockutils [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.610 2 DEBUG oslo_concurrency.lockutils [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "905ba276-3439-4ffc-9fa7-b8ce71d79b96-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.611 2 DEBUG nova.compute.manager [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] No waiting events found dispatching network-vif-plugged-2f616ccf-0f40-4768-8b90-07dae3707b82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.611 2 WARNING nova.compute.manager [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Received unexpected event network-vif-plugged-2f616ccf-0f40-4768-8b90-07dae3707b82 for instance with vm_state deleted and task_state None.#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.611 2 DEBUG nova.compute.manager [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Received event network-vif-deleted-2f616ccf-0f40-4768-8b90-07dae3707b82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.612 2 INFO nova.compute.manager [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Neutron deleted interface 2f616ccf-0f40-4768-8b90-07dae3707b82; detaching it from the instance and deleting it from the info cache#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.612 2 DEBUG nova.network.neutron [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Instance is deleted, no further info cache update update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:106#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.614 2 DEBUG nova.compute.manager [req-802a1a41-b6dc-4a81-ab07-6ac4635d07b7 req-51945dd0-1874-4ec6-8176-b9fcdcb1b3dd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Detach interface failed, port_id=2f616ccf-0f40-4768-8b90-07dae3707b82, reason: Instance 905ba276-3439-4ffc-9fa7-b8ce71d79b96 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.759 2 DEBUG nova.network.neutron [req-4972f7e0-2877-4498-9596-6f9dc23bf0f2 req-6128dd7e-3c7f-45d7-9c46-dc0e05f69afb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Updated VIF entry in instance network info cache for port 6ac9cc3d-5039-41fa-a966-ec61d9e9c38b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.760 2 DEBUG nova.network.neutron [req-4972f7e0-2877-4498-9596-6f9dc23bf0f2 req-6128dd7e-3c7f-45d7-9c46-dc0e05f69afb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Updating instance_info_cache with network_info: [{"id": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "address": "fa:16:3e:5d:0b:e9", "network": {"id": "a9053617-1148-4139-a949-8321e760481f", "bridge": "br-int", "label": "tempest-network-smoke--1784539601", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ac9cc3d-50", "ovs_interfaceid": "6ac9cc3d-5039-41fa-a966-ec61d9e9c38b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.791 2 DEBUG oslo_concurrency.lockutils [req-4972f7e0-2877-4498-9596-6f9dc23bf0f2 req-6128dd7e-3c7f-45d7-9c46-dc0e05f69afb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-3aa55e8a-0c2d-4f7b-aac0-c393e35ec679" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:12:04 np0005474864 nova_compute[192593]: 2025-10-07 20:12:04.937 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:05 np0005474864 nova_compute[192593]: 2025-10-07 20:12:05.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:05 np0005474864 podman[221978]: 2025-10-07 20:12:05.389380543 +0000 UTC m=+0.078674283 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:12:06 np0005474864 nova_compute[192593]: 2025-10-07 20:12:06.465 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759867911.463465, b00f20b4-40d9-4fe7-8782-20859f161134 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:12:06 np0005474864 nova_compute[192593]: 2025-10-07 20:12:06.465 2 INFO nova.compute.manager [-] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] VM Stopped (Lifecycle Event)#033[00m
Oct  7 16:12:06 np0005474864 nova_compute[192593]: 2025-10-07 20:12:06.483 2 DEBUG nova.compute.manager [None req-d9905517-6a64-4e50-a3b5-796a616435eb - - - - - -] [instance: b00f20b4-40d9-4fe7-8782-20859f161134] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:12:06 np0005474864 nova_compute[192593]: 2025-10-07 20:12:06.639 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:07 np0005474864 nova_compute[192593]: 2025-10-07 20:12:07.591 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:09 np0005474864 podman[221999]: 2025-10-07 20:12:09.385208246 +0000 UTC m=+0.079372683 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  7 16:12:11 np0005474864 nova_compute[192593]: 2025-10-07 20:12:11.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:12 np0005474864 podman[222024]: 2025-10-07 20:12:12.384062957 +0000 UTC m=+0.072702742 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:12:12 np0005474864 nova_compute[192593]: 2025-10-07 20:12:12.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:16.185 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:16.186 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:16.186 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:16 np0005474864 nova_compute[192593]: 2025-10-07 20:12:16.258 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759867921.2566562, 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:12:16 np0005474864 nova_compute[192593]: 2025-10-07 20:12:16.259 2 INFO nova.compute.manager [-] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] VM Stopped (Lifecycle Event)#033[00m
Oct  7 16:12:16 np0005474864 nova_compute[192593]: 2025-10-07 20:12:16.285 2 DEBUG nova.compute.manager [None req-5bb0821b-bd96-417e-ad57-64ef693874e7 - - - - - -] [instance: 3aa55e8a-0c2d-4f7b-aac0-c393e35ec679] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:12:16 np0005474864 nova_compute[192593]: 2025-10-07 20:12:16.618 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759867921.6170273, 905ba276-3439-4ffc-9fa7-b8ce71d79b96 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:12:16 np0005474864 nova_compute[192593]: 2025-10-07 20:12:16.619 2 INFO nova.compute.manager [-] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] VM Stopped (Lifecycle Event)#033[00m
Oct  7 16:12:16 np0005474864 nova_compute[192593]: 2025-10-07 20:12:16.642 2 DEBUG nova.compute.manager [None req-77b17e32-c3d0-4eff-b219-2c80d460ea8a - - - - - -] [instance: 905ba276-3439-4ffc-9fa7-b8ce71d79b96] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:12:16 np0005474864 nova_compute[192593]: 2025-10-07 20:12:16.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:17 np0005474864 nova_compute[192593]: 2025-10-07 20:12:17.094 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:12:17 np0005474864 nova_compute[192593]: 2025-10-07 20:12:17.596 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:18 np0005474864 nova_compute[192593]: 2025-10-07 20:12:18.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:12:19 np0005474864 nova_compute[192593]: 2025-10-07 20:12:19.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:12:19 np0005474864 nova_compute[192593]: 2025-10-07 20:12:19.136 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:19 np0005474864 nova_compute[192593]: 2025-10-07 20:12:19.137 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:19 np0005474864 nova_compute[192593]: 2025-10-07 20:12:19.137 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:19 np0005474864 nova_compute[192593]: 2025-10-07 20:12:19.137 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:12:19 np0005474864 nova_compute[192593]: 2025-10-07 20:12:19.404 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:12:19 np0005474864 nova_compute[192593]: 2025-10-07 20:12:19.406 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5754MB free_disk=73.46788787841797GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:12:19 np0005474864 nova_compute[192593]: 2025-10-07 20:12:19.406 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:19 np0005474864 nova_compute[192593]: 2025-10-07 20:12:19.406 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:19 np0005474864 nova_compute[192593]: 2025-10-07 20:12:19.478 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:12:19 np0005474864 nova_compute[192593]: 2025-10-07 20:12:19.479 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:12:19 np0005474864 nova_compute[192593]: 2025-10-07 20:12:19.505 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:12:19 np0005474864 nova_compute[192593]: 2025-10-07 20:12:19.525 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:12:19 np0005474864 nova_compute[192593]: 2025-10-07 20:12:19.553 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:12:19 np0005474864 nova_compute[192593]: 2025-10-07 20:12:19.553 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:21 np0005474864 nova_compute[192593]: 2025-10-07 20:12:21.548 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:12:21 np0005474864 nova_compute[192593]: 2025-10-07 20:12:21.549 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:12:21 np0005474864 nova_compute[192593]: 2025-10-07 20:12:21.549 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:12:21 np0005474864 nova_compute[192593]: 2025-10-07 20:12:21.645 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:22 np0005474864 nova_compute[192593]: 2025-10-07 20:12:22.598 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:23 np0005474864 nova_compute[192593]: 2025-10-07 20:12:23.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:12:23 np0005474864 nova_compute[192593]: 2025-10-07 20:12:23.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:12:23 np0005474864 nova_compute[192593]: 2025-10-07 20:12:23.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:12:23 np0005474864 nova_compute[192593]: 2025-10-07 20:12:23.128 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:12:23 np0005474864 nova_compute[192593]: 2025-10-07 20:12:23.129 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:12:23 np0005474864 nova_compute[192593]: 2025-10-07 20:12:23.129 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:12:23 np0005474864 nova_compute[192593]: 2025-10-07 20:12:23.130 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:12:26 np0005474864 podman[222048]: 2025-10-07 20:12:26.394997863 +0000 UTC m=+0.084406309 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  7 16:12:26 np0005474864 podman[222049]: 2025-10-07 20:12:26.424438779 +0000 UTC m=+0.103060365 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, version=9.6, release=1755695350, vendor=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=)
Oct  7 16:12:26 np0005474864 nova_compute[192593]: 2025-10-07 20:12:26.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:27 np0005474864 nova_compute[192593]: 2025-10-07 20:12:27.599 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:12:31.258 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:12:31 np0005474864 podman[222093]: 2025-10-07 20:12:31.407695569 +0000 UTC m=+0.102902531 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=ovn_controller, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  7 16:12:31 np0005474864 podman[222092]: 2025-10-07 20:12:31.411081736 +0000 UTC m=+0.100621435 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid)
Oct  7 16:12:31 np0005474864 podman[222094]: 2025-10-07 20:12:31.413516166 +0000 UTC m=+0.091844892 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  7 16:12:31 np0005474864 nova_compute[192593]: 2025-10-07 20:12:31.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:32 np0005474864 nova_compute[192593]: 2025-10-07 20:12:32.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:34 np0005474864 nova_compute[192593]: 2025-10-07 20:12:34.657 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "f2a4ae00-d828-4178-880f-cc034629d96e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:34 np0005474864 nova_compute[192593]: 2025-10-07 20:12:34.657 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:34 np0005474864 nova_compute[192593]: 2025-10-07 20:12:34.676 2 DEBUG nova.compute.manager [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  7 16:12:34 np0005474864 nova_compute[192593]: 2025-10-07 20:12:34.760 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:34 np0005474864 nova_compute[192593]: 2025-10-07 20:12:34.761 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:34 np0005474864 nova_compute[192593]: 2025-10-07 20:12:34.772 2 DEBUG nova.virt.hardware [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  7 16:12:34 np0005474864 nova_compute[192593]: 2025-10-07 20:12:34.773 2 INFO nova.compute.claims [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  7 16:12:34 np0005474864 nova_compute[192593]: 2025-10-07 20:12:34.880 2 DEBUG nova.compute.provider_tree [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:12:34 np0005474864 nova_compute[192593]: 2025-10-07 20:12:34.903 2 DEBUG nova.scheduler.client.report [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:12:34 np0005474864 nova_compute[192593]: 2025-10-07 20:12:34.934 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:34 np0005474864 nova_compute[192593]: 2025-10-07 20:12:34.935 2 DEBUG nova.compute.manager [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  7 16:12:34 np0005474864 nova_compute[192593]: 2025-10-07 20:12:34.993 2 DEBUG nova.compute.manager [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  7 16:12:34 np0005474864 nova_compute[192593]: 2025-10-07 20:12:34.994 2 DEBUG nova.network.neutron [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.020 2 INFO nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.047 2 DEBUG nova.compute.manager [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.153 2 DEBUG nova.compute.manager [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.155 2 DEBUG nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.156 2 INFO nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Creating image(s)#033[00m
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.157 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.157 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.159 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.185 2 DEBUG oslo_concurrency.processutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.272 2 DEBUG oslo_concurrency.processutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.274 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.274 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.286 2 DEBUG oslo_concurrency.processutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.355 2 DEBUG oslo_concurrency.processutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.356 2 DEBUG oslo_concurrency.processutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.390 2 DEBUG oslo_concurrency.processutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.391 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.117s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.391 2 DEBUG oslo_concurrency.processutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.460 2 DEBUG oslo_concurrency.processutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.462 2 DEBUG nova.virt.disk.api [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Checking if we can resize image /var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.462 2 DEBUG oslo_concurrency.processutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.521 2 DEBUG oslo_concurrency.processutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.522 2 DEBUG nova.virt.disk.api [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Cannot resize image /var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.523 2 DEBUG nova.objects.instance [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lazy-loading 'migration_context' on Instance uuid f2a4ae00-d828-4178-880f-cc034629d96e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.555 2 DEBUG nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.556 2 DEBUG nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Ensure instance console log exists: /var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.556 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.557 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.557 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:12:35 np0005474864 nova_compute[192593]: 2025-10-07 20:12:35.801 2 DEBUG nova.policy [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  7 16:12:36 np0005474864 podman[222171]: 2025-10-07 20:12:36.395763635 +0000 UTC m=+0.084982284 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:12:36 np0005474864 nova_compute[192593]: 2025-10-07 20:12:36.651 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:12:37 np0005474864 nova_compute[192593]: 2025-10-07 20:12:37.638 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:12:38 np0005474864 nova_compute[192593]: 2025-10-07 20:12:38.192 2 DEBUG nova.network.neutron [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Successfully created port: 63b103d2-ef83-41aa-9080-3137adafe387 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.335 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "1669315a-9455-4ddc-bddf-b5a535be9294" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.335 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.357 2 DEBUG nova.compute.manager [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402
Oct  7 16:12:40 np0005474864 podman[222190]: 2025-10-07 20:12:40.405436284 +0000 UTC m=+0.057714990 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.455 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.456 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.476 2 DEBUG nova.virt.hardware [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.476 2 INFO nova.compute.claims [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Claim successful on node compute-2.ctlplane.example.com
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.515 2 DEBUG nova.network.neutron [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Successfully updated port: 63b103d2-ef83-41aa-9080-3137adafe387 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.536 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.537 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquired lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.537 2 DEBUG nova.network.neutron [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.641 2 DEBUG nova.compute.provider_tree [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.663 2 DEBUG nova.scheduler.client.report [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.702 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.704 2 DEBUG nova.compute.manager [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.770 2 DEBUG nova.compute.manager [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.771 2 DEBUG nova.network.neutron [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.797 2 INFO nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.816 2 DEBUG nova.compute.manager [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.847 2 DEBUG nova.network.neutron [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.908 2 DEBUG nova.compute.manager [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.910 2 DEBUG nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.910 2 INFO nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Creating image(s)
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.911 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "/var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.911 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "/var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.913 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "/var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:12:40 np0005474864 nova_compute[192593]: 2025-10-07 20:12:40.939 2 DEBUG oslo_concurrency.processutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.041 2 DEBUG oslo_concurrency.processutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.102s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.043 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.044 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.068 2 DEBUG oslo_concurrency.processutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.137 2 DEBUG nova.policy [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.143 2 DEBUG oslo_concurrency.processutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.144 2 DEBUG oslo_concurrency.processutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.334 2 DEBUG oslo_concurrency.processutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/disk 1073741824" returned: 0 in 0.190s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.335 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.292s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.336 2 DEBUG oslo_concurrency.processutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.422 2 DEBUG oslo_concurrency.processutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.423 2 DEBUG nova.virt.disk.api [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Checking if we can resize image /var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.424 2 DEBUG oslo_concurrency.processutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.509 2 DEBUG oslo_concurrency.processutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.511 2 DEBUG nova.virt.disk.api [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Cannot resize image /var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.512 2 DEBUG nova.objects.instance [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'migration_context' on Instance uuid 1669315a-9455-4ddc-bddf-b5a535be9294 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.532 2 DEBUG nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.532 2 DEBUG nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Ensure instance console log exists: /var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.533 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.534 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.534 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:41 np0005474864 nova_compute[192593]: 2025-10-07 20:12:41.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.640 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.648 2 DEBUG nova.compute.manager [req-c1d93341-4f8c-4916-b09b-f466e3cf25f8 req-e7b136e4-bcdf-4228-a1b5-86fea3a5fbe8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received event network-changed-63b103d2-ef83-41aa-9080-3137adafe387 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.649 2 DEBUG nova.compute.manager [req-c1d93341-4f8c-4916-b09b-f466e3cf25f8 req-e7b136e4-bcdf-4228-a1b5-86fea3a5fbe8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Refreshing instance network info cache due to event network-changed-63b103d2-ef83-41aa-9080-3137adafe387. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.649 2 DEBUG oslo_concurrency.lockutils [req-c1d93341-4f8c-4916-b09b-f466e3cf25f8 req-e7b136e4-bcdf-4228-a1b5-86fea3a5fbe8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.928 2 DEBUG nova.network.neutron [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Updating instance_info_cache with network_info: [{"id": "63b103d2-ef83-41aa-9080-3137adafe387", "address": "fa:16:3e:f0:7b:b2", "network": {"id": "b153978d-a2d5-4c7d-8ff5-8249927e8e0f", "bridge": "br-int", "label": "tempest-network-smoke--1424478439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63b103d2-ef", "ovs_interfaceid": "63b103d2-ef83-41aa-9080-3137adafe387", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.948 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Releasing lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.948 2 DEBUG nova.compute.manager [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Instance network_info: |[{"id": "63b103d2-ef83-41aa-9080-3137adafe387", "address": "fa:16:3e:f0:7b:b2", "network": {"id": "b153978d-a2d5-4c7d-8ff5-8249927e8e0f", "bridge": "br-int", "label": "tempest-network-smoke--1424478439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63b103d2-ef", "ovs_interfaceid": "63b103d2-ef83-41aa-9080-3137adafe387", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.949 2 DEBUG oslo_concurrency.lockutils [req-c1d93341-4f8c-4916-b09b-f466e3cf25f8 req-e7b136e4-bcdf-4228-a1b5-86fea3a5fbe8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.950 2 DEBUG nova.network.neutron [req-c1d93341-4f8c-4916-b09b-f466e3cf25f8 req-e7b136e4-bcdf-4228-a1b5-86fea3a5fbe8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Refreshing network info cache for port 63b103d2-ef83-41aa-9080-3137adafe387 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.954 2 DEBUG nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Start _get_guest_xml network_info=[{"id": "63b103d2-ef83-41aa-9080-3137adafe387", "address": "fa:16:3e:f0:7b:b2", "network": {"id": "b153978d-a2d5-4c7d-8ff5-8249927e8e0f", "bridge": "br-int", "label": "tempest-network-smoke--1424478439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63b103d2-ef", "ovs_interfaceid": "63b103d2-ef83-41aa-9080-3137adafe387", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.960 2 WARNING nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.969 2 DEBUG nova.virt.libvirt.host [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.970 2 DEBUG nova.virt.libvirt.host [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.975 2 DEBUG nova.virt.libvirt.host [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.975 2 DEBUG nova.virt.libvirt.host [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.977 2 DEBUG nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.977 2 DEBUG nova.virt.hardware [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T20:09:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3fec056a-1226-48ad-a02c-e4fe097a9363',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.978 2 DEBUG nova.virt.hardware [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.978 2 DEBUG nova.virt.hardware [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.978 2 DEBUG nova.virt.hardware [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.978 2 DEBUG nova.virt.hardware [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.979 2 DEBUG nova.virt.hardware [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.979 2 DEBUG nova.virt.hardware [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.979 2 DEBUG nova.virt.hardware [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.980 2 DEBUG nova.virt.hardware [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.980 2 DEBUG nova.virt.hardware [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.980 2 DEBUG nova.virt.hardware [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.985 2 DEBUG nova.virt.libvirt.vif [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:12:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-93472678',display_name='tempest-TestNetworkBasicOps-server-93472678',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-93472678',id=15,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNqIkKxPhTDFrF1mYsLGT4VoIX/sFe8HBnealVia+nFLppfgO0Xe3SvxixBmFjjO1nG4Niu4XVzOpfWewXCUpRStprU6Q2hEwG42+Uag+EI9HED37Cp6MNeCsqGkhMnNLA==',key_name='tempest-TestNetworkBasicOps-2102187017',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-93oohg0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:12:35Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=f2a4ae00-d828-4178-880f-cc034629d96e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "63b103d2-ef83-41aa-9080-3137adafe387", "address": "fa:16:3e:f0:7b:b2", "network": {"id": "b153978d-a2d5-4c7d-8ff5-8249927e8e0f", "bridge": "br-int", "label": "tempest-network-smoke--1424478439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63b103d2-ef", "ovs_interfaceid": "63b103d2-ef83-41aa-9080-3137adafe387", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.985 2 DEBUG nova.network.os_vif_util [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converting VIF {"id": "63b103d2-ef83-41aa-9080-3137adafe387", "address": "fa:16:3e:f0:7b:b2", "network": {"id": "b153978d-a2d5-4c7d-8ff5-8249927e8e0f", "bridge": "br-int", "label": "tempest-network-smoke--1424478439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63b103d2-ef", "ovs_interfaceid": "63b103d2-ef83-41aa-9080-3137adafe387", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.986 2 DEBUG nova.network.os_vif_util [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:7b:b2,bridge_name='br-int',has_traffic_filtering=True,id=63b103d2-ef83-41aa-9080-3137adafe387,network=Network(b153978d-a2d5-4c7d-8ff5-8249927e8e0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63b103d2-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:12:42 np0005474864 nova_compute[192593]: 2025-10-07 20:12:42.987 2 DEBUG nova.objects.instance [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lazy-loading 'pci_devices' on Instance uuid f2a4ae00-d828-4178-880f-cc034629d96e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.001 2 DEBUG nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] End _get_guest_xml xml=<domain type="kvm">
Oct  7 16:12:43 np0005474864 nova_compute[192593]:  <uuid>f2a4ae00-d828-4178-880f-cc034629d96e</uuid>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:  <name>instance-0000000f</name>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:  <memory>131072</memory>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:  <vcpu>1</vcpu>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <nova:name>tempest-TestNetworkBasicOps-server-93472678</nova:name>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <nova:creationTime>2025-10-07 20:12:42</nova:creationTime>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <nova:flavor name="m1.nano">
Oct  7 16:12:43 np0005474864 nova_compute[192593]:        <nova:memory>128</nova:memory>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:        <nova:disk>1</nova:disk>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:        <nova:swap>0</nova:swap>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:        <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:        <nova:vcpus>1</nova:vcpus>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      </nova:flavor>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <nova:owner>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:        <nova:user uuid="fde8db13cdde4728903e9d2749f853e1">tempest-TestNetworkBasicOps-666319938-project-member</nova:user>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:        <nova:project uuid="57491b24c6b2419c842483a87c8b4d42">tempest-TestNetworkBasicOps-666319938</nova:project>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      </nova:owner>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <nova:ports>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:        <nova:port uuid="63b103d2-ef83-41aa-9080-3137adafe387">
Oct  7 16:12:43 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      </nova:ports>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    </nova:instance>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:  <sysinfo type="smbios">
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <entry name="manufacturer">RDO</entry>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <entry name="product">OpenStack Compute</entry>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <entry name="serial">f2a4ae00-d828-4178-880f-cc034629d96e</entry>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <entry name="uuid">f2a4ae00-d828-4178-880f-cc034629d96e</entry>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <entry name="family">Virtual Machine</entry>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <boot dev="hd"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <smbios mode="sysinfo"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <vmcoreinfo/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:  <clock offset="utc">
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <timer name="pit" tickpolicy="delay"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <timer name="hpet" present="no"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:  <cpu mode="custom" match="exact">
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <model>Nehalem</model>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <topology sockets="1" cores="1" threads="1"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <disk type="file" device="disk">
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <target dev="vda" bus="virtio"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <disk type="file" device="cdrom">
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <driver name="qemu" type="raw" cache="none"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk.config"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <target dev="sda" bus="sata"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:f0:7b:b2"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <target dev="tap63b103d2-ef"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <serial type="pty">
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <log file="/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/console.log" append="off"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <input type="tablet" bus="usb"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <rng model="virtio">
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <backend model="random">/dev/urandom</backend>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <controller type="usb" index="0"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    <memballoon model="virtio">
Oct  7 16:12:43 np0005474864 nova_compute[192593]:      <stats period="10"/>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:12:43 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:12:43 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:12:43 np0005474864 nova_compute[192593]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.003 2 DEBUG nova.compute.manager [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Preparing to wait for external event network-vif-plugged-63b103d2-ef83-41aa-9080-3137adafe387 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.004 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.004 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.004 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.006 2 DEBUG nova.virt.libvirt.vif [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:12:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-93472678',display_name='tempest-TestNetworkBasicOps-server-93472678',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-93472678',id=15,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNqIkKxPhTDFrF1mYsLGT4VoIX/sFe8HBnealVia+nFLppfgO0Xe3SvxixBmFjjO1nG4Niu4XVzOpfWewXCUpRStprU6Q2hEwG42+Uag+EI9HED37Cp6MNeCsqGkhMnNLA==',key_name='tempest-TestNetworkBasicOps-2102187017',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-93oohg0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:12:35Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=f2a4ae00-d828-4178-880f-cc034629d96e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "63b103d2-ef83-41aa-9080-3137adafe387", "address": "fa:16:3e:f0:7b:b2", "network": {"id": "b153978d-a2d5-4c7d-8ff5-8249927e8e0f", "bridge": "br-int", "label": "tempest-network-smoke--1424478439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63b103d2-ef", "ovs_interfaceid": "63b103d2-ef83-41aa-9080-3137adafe387", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.006 2 DEBUG nova.network.os_vif_util [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converting VIF {"id": "63b103d2-ef83-41aa-9080-3137adafe387", "address": "fa:16:3e:f0:7b:b2", "network": {"id": "b153978d-a2d5-4c7d-8ff5-8249927e8e0f", "bridge": "br-int", "label": "tempest-network-smoke--1424478439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63b103d2-ef", "ovs_interfaceid": "63b103d2-ef83-41aa-9080-3137adafe387", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.007 2 DEBUG nova.network.os_vif_util [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f0:7b:b2,bridge_name='br-int',has_traffic_filtering=True,id=63b103d2-ef83-41aa-9080-3137adafe387,network=Network(b153978d-a2d5-4c7d-8ff5-8249927e8e0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63b103d2-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.008 2 DEBUG os_vif [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:7b:b2,bridge_name='br-int',has_traffic_filtering=True,id=63b103d2-ef83-41aa-9080-3137adafe387,network=Network(b153978d-a2d5-4c7d-8ff5-8249927e8e0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63b103d2-ef') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.009 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.010 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.015 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap63b103d2-ef, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.016 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap63b103d2-ef, col_values=(('external_ids', {'iface-id': '63b103d2-ef83-41aa-9080-3137adafe387', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f0:7b:b2', 'vm-uuid': 'f2a4ae00-d828-4178-880f-cc034629d96e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:12:43 np0005474864 NetworkManager[51631]: <info>  [1759867963.0221] manager: (tap63b103d2-ef): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/44)
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.031 2 INFO os_vif [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f0:7b:b2,bridge_name='br-int',has_traffic_filtering=True,id=63b103d2-ef83-41aa-9080-3137adafe387,network=Network(b153978d-a2d5-4c7d-8ff5-8249927e8e0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63b103d2-ef')#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.138 2 DEBUG nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.139 2 DEBUG nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.139 2 DEBUG nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] No VIF found with MAC fa:16:3e:f0:7b:b2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.139 2 INFO nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Using config drive#033[00m
Oct  7 16:12:43 np0005474864 podman[222231]: 2025-10-07 20:12:43.370203195 +0000 UTC m=+0.061999304 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack 
Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  7 16:12:43 np0005474864 nova_compute[192593]: 2025-10-07 20:12:43.419 2 DEBUG nova.network.neutron [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Successfully created port: 7ce9ef63-687e-420f-b85d-071abf475fd7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  7 16:12:44 np0005474864 nova_compute[192593]: 2025-10-07 20:12:44.269 2 INFO nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Creating config drive at /var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk.config#033[00m
Oct  7 16:12:44 np0005474864 nova_compute[192593]: 2025-10-07 20:12:44.278 2 DEBUG oslo_concurrency.processutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz2cjjam2 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:12:44 np0005474864 nova_compute[192593]: 2025-10-07 20:12:44.419 2 DEBUG oslo_concurrency.processutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpz2cjjam2" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:12:44 np0005474864 kernel: tap63b103d2-ef: entered promiscuous mode
Oct  7 16:12:44 np0005474864 NetworkManager[51631]: <info>  [1759867964.5091] manager: (tap63b103d2-ef): new Tun device (/org/freedesktop/NetworkManager/Devices/45)
Oct  7 16:12:44 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:44Z|00070|binding|INFO|Claiming lport 63b103d2-ef83-41aa-9080-3137adafe387 for this chassis.
Oct  7 16:12:44 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:44Z|00071|binding|INFO|63b103d2-ef83-41aa-9080-3137adafe387: Claiming fa:16:3e:f0:7b:b2 10.100.0.9
Oct  7 16:12:44 np0005474864 nova_compute[192593]: 2025-10-07 20:12:44.508 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.523 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:7b:b2 10.100.0.9'], port_security=['fa:16:3e:f0:7b:b2 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'f2a4ae00-d828-4178-880f-cc034629d96e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b153978d-a2d5-4c7d-8ff5-8249927e8e0f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '57491b24c6b2419c842483a87c8b4d42', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'd4fc7d5a-f2f2-4b9c-aee9-ab5a55c88c50', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71790900-696c-48c8-9845-d2f1ae8dbfc4, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=63b103d2-ef83-41aa-9080-3137adafe387) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.524 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 63b103d2-ef83-41aa-9080-3137adafe387 in datapath b153978d-a2d5-4c7d-8ff5-8249927e8e0f bound to our chassis#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.526 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b153978d-a2d5-4c7d-8ff5-8249927e8e0f#033[00m
Oct  7 16:12:44 np0005474864 systemd-machined[152586]: New machine qemu-5-instance-0000000f.
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.538 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[2f7d1045-2b96-4c1d-b89c-214f71221da8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.540 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb153978d-a1 in ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.541 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb153978d-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.542 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d271b98b-1e95-436c-b614-117f760d6468]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.543 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d1c29745-27c6-456c-bd03-4cc736c48985]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.555 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[81be51b7-88c4-4e3f-8a7b-acd85a7389fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:44 np0005474864 systemd[1]: Started Virtual Machine qemu-5-instance-0000000f.
Oct  7 16:12:44 np0005474864 nova_compute[192593]: 2025-10-07 20:12:44.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:44 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:44Z|00072|binding|INFO|Setting lport 63b103d2-ef83-41aa-9080-3137adafe387 ovn-installed in OVS
Oct  7 16:12:44 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:44Z|00073|binding|INFO|Setting lport 63b103d2-ef83-41aa-9080-3137adafe387 up in Southbound
Oct  7 16:12:44 np0005474864 nova_compute[192593]: 2025-10-07 20:12:44.571 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.571 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[92656569-f92e-488e-9bdd-7321d48605f3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:44 np0005474864 systemd-udevd[222272]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:12:44 np0005474864 NetworkManager[51631]: <info>  [1759867964.5924] device (tap63b103d2-ef): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:12:44 np0005474864 NetworkManager[51631]: <info>  [1759867964.5930] device (tap63b103d2-ef): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.604 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[61c85ae4-d6c5-4bec-82c0-95081b03f30c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:44 np0005474864 NetworkManager[51631]: <info>  [1759867964.6105] manager: (tapb153978d-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/46)
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.610 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[5ba4037c-cdf4-4b81-a2cf-e95593a19a76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:44 np0005474864 systemd-udevd[222276]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.646 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[d66ef218-65d7-4bba-942f-bc3dbc9e5a1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.650 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[31888add-c66e-4d76-8097-ed9dd2b86a1f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:44 np0005474864 NetworkManager[51631]: <info>  [1759867964.6700] device (tapb153978d-a0): carrier: link connected
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.674 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[0442d354-19fc-46a9-af6b-8f4a4800c69a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.694 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[47b22c7b-31b7-49fd-b51d-f44e8004d7e0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb153978d-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:e8:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360056, 'reachable_time': 42066, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222302, 'error': None, 'target': 'ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.714 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[595d66c1-5b99-4944-861f-927566fc1eff]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fee7:e8cb'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 360056, 'tstamp': 360056}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222303, 'error': None, 'target': 'ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.737 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[8b3cb811-4b8d-4760-abfd-956d7c78ae36]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb153978d-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:e7:e8:cb'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 26], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360056, 'reachable_time': 42066, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222305, 'error': None, 'target': 'ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.791 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[24e67258-deaa-4774-aa57-c6782464b462]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.875 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ed2685bf-8f95-4636-a25c-28440cf697c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.877 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb153978d-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.879 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.879 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb153978d-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:44 np0005474864 kernel: tapb153978d-a0: entered promiscuous mode
Oct  7 16:12:44 np0005474864 NetworkManager[51631]: <info>  [1759867964.8826] manager: (tapb153978d-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/47)
Oct  7 16:12:44 np0005474864 nova_compute[192593]: 2025-10-07 20:12:44.883 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.893 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb153978d-a0, col_values=(('external_ids', {'iface-id': 'f34158c9-a766-4691-8248-3424f7b7ca88'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:44 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:44Z|00074|binding|INFO|Releasing lport f34158c9-a766-4691-8248-3424f7b7ca88 from this chassis (sb_readonly=0)
Oct  7 16:12:44 np0005474864 nova_compute[192593]: 2025-10-07 20:12:44.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:44 np0005474864 nova_compute[192593]: 2025-10-07 20:12:44.906 2 DEBUG nova.network.neutron [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Successfully created port: be7697b8-3851-4db2-8ae0-bc42997f1332 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  7 16:12:44 np0005474864 nova_compute[192593]: 2025-10-07 20:12:44.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.919 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b153978d-a2d5-4c7d-8ff5-8249927e8e0f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b153978d-a2d5-4c7d-8ff5-8249927e8e0f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.920 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c618266a-d72f-4859-86e8-b1b04e89fa6e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.921 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-b153978d-a2d5-4c7d-8ff5-8249927e8e0f
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/b153978d-a2d5-4c7d-8ff5-8249927e8e0f.pid.haproxy
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID b153978d-a2d5-4c7d-8ff5-8249927e8e0f
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:12:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:44.922 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f', 'env', 'PROCESS_TAG=haproxy-b153978d-a2d5-4c7d-8ff5-8249927e8e0f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b153978d-a2d5-4c7d-8ff5-8249927e8e0f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:12:45 np0005474864 nova_compute[192593]: 2025-10-07 20:12:45.215 2 DEBUG nova.network.neutron [req-c1d93341-4f8c-4916-b09b-f466e3cf25f8 req-e7b136e4-bcdf-4228-a1b5-86fea3a5fbe8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Updated VIF entry in instance network info cache for port 63b103d2-ef83-41aa-9080-3137adafe387. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:12:45 np0005474864 nova_compute[192593]: 2025-10-07 20:12:45.216 2 DEBUG nova.network.neutron [req-c1d93341-4f8c-4916-b09b-f466e3cf25f8 req-e7b136e4-bcdf-4228-a1b5-86fea3a5fbe8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Updating instance_info_cache with network_info: [{"id": "63b103d2-ef83-41aa-9080-3137adafe387", "address": "fa:16:3e:f0:7b:b2", "network": {"id": "b153978d-a2d5-4c7d-8ff5-8249927e8e0f", "bridge": "br-int", "label": "tempest-network-smoke--1424478439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63b103d2-ef", "ovs_interfaceid": "63b103d2-ef83-41aa-9080-3137adafe387", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:12:45 np0005474864 nova_compute[192593]: 2025-10-07 20:12:45.236 2 DEBUG oslo_concurrency.lockutils [req-c1d93341-4f8c-4916-b09b-f466e3cf25f8 req-e7b136e4-bcdf-4228-a1b5-86fea3a5fbe8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:12:45 np0005474864 nova_compute[192593]: 2025-10-07 20:12:45.299 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759867965.2980542, f2a4ae00-d828-4178-880f-cc034629d96e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:12:45 np0005474864 nova_compute[192593]: 2025-10-07 20:12:45.299 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] VM Started (Lifecycle Event)#033[00m
Oct  7 16:12:45 np0005474864 nova_compute[192593]: 2025-10-07 20:12:45.322 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:12:45 np0005474864 nova_compute[192593]: 2025-10-07 20:12:45.327 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759867965.298599, f2a4ae00-d828-4178-880f-cc034629d96e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:12:45 np0005474864 nova_compute[192593]: 2025-10-07 20:12:45.327 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:12:45 np0005474864 nova_compute[192593]: 2025-10-07 20:12:45.354 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:12:45 np0005474864 nova_compute[192593]: 2025-10-07 20:12:45.357 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:12:45 np0005474864 nova_compute[192593]: 2025-10-07 20:12:45.378 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:12:45 np0005474864 podman[222343]: 2025-10-07 20:12:45.334539096 +0000 UTC m=+0.032032612 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:12:45 np0005474864 podman[222343]: 2025-10-07 20:12:45.67439156 +0000 UTC m=+0.371885056 container create 890454fb28a95b3a710f1657b6985918bf9571a5d0e81eb321902f36d262a495 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:12:45 np0005474864 systemd[1]: Started libpod-conmon-890454fb28a95b3a710f1657b6985918bf9571a5d0e81eb321902f36d262a495.scope.
Oct  7 16:12:45 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:12:45 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a49fbdc4e04a8c0abd534c13cf95f7aac040979398de302462cafb60d4fabac2/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:12:45 np0005474864 podman[222343]: 2025-10-07 20:12:45.852229615 +0000 UTC m=+0.549723211 container init 890454fb28a95b3a710f1657b6985918bf9571a5d0e81eb321902f36d262a495 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  7 16:12:45 np0005474864 podman[222343]: 2025-10-07 20:12:45.857529828 +0000 UTC m=+0.555023364 container start 890454fb28a95b3a710f1657b6985918bf9571a5d0e81eb321902f36d262a495 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:12:45 np0005474864 neutron-haproxy-ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f[222358]: [NOTICE]   (222362) : New worker (222364) forked
Oct  7 16:12:45 np0005474864 neutron-haproxy-ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f[222358]: [NOTICE]   (222362) : Loading success.
Oct  7 16:12:47 np0005474864 nova_compute[192593]: 2025-10-07 20:12:47.180 2 DEBUG nova.network.neutron [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Successfully updated port: 7ce9ef63-687e-420f-b85d-071abf475fd7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:12:47 np0005474864 nova_compute[192593]: 2025-10-07 20:12:47.642 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:48 np0005474864 nova_compute[192593]: 2025-10-07 20:12:48.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:48 np0005474864 nova_compute[192593]: 2025-10-07 20:12:48.316 2 DEBUG nova.compute.manager [req-b3768faa-0270-4467-abc4-dca01a7736c0 req-033e7dda-5f10-41d6-bca2-b975caef6c92 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received event network-changed-7ce9ef63-687e-420f-b85d-071abf475fd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:12:48 np0005474864 nova_compute[192593]: 2025-10-07 20:12:48.316 2 DEBUG nova.compute.manager [req-b3768faa-0270-4467-abc4-dca01a7736c0 req-033e7dda-5f10-41d6-bca2-b975caef6c92 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Refreshing instance network info cache due to event network-changed-7ce9ef63-687e-420f-b85d-071abf475fd7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:12:48 np0005474864 nova_compute[192593]: 2025-10-07 20:12:48.317 2 DEBUG oslo_concurrency.lockutils [req-b3768faa-0270-4467-abc4-dca01a7736c0 req-033e7dda-5f10-41d6-bca2-b975caef6c92 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-1669315a-9455-4ddc-bddf-b5a535be9294" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:12:48 np0005474864 nova_compute[192593]: 2025-10-07 20:12:48.317 2 DEBUG oslo_concurrency.lockutils [req-b3768faa-0270-4467-abc4-dca01a7736c0 req-033e7dda-5f10-41d6-bca2-b975caef6c92 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-1669315a-9455-4ddc-bddf-b5a535be9294" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:12:48 np0005474864 nova_compute[192593]: 2025-10-07 20:12:48.318 2 DEBUG nova.network.neutron [req-b3768faa-0270-4467-abc4-dca01a7736c0 req-033e7dda-5f10-41d6-bca2-b975caef6c92 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Refreshing network info cache for port 7ce9ef63-687e-420f-b85d-071abf475fd7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:12:48 np0005474864 nova_compute[192593]: 2025-10-07 20:12:48.810 2 DEBUG nova.network.neutron [req-b3768faa-0270-4467-abc4-dca01a7736c0 req-033e7dda-5f10-41d6-bca2-b975caef6c92 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:12:48 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:48.994 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:76:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'ba:bb:91:e8:2b:5d'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:12:48 np0005474864 nova_compute[192593]: 2025-10-07 20:12:48.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:48 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:48.996 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  7 16:12:49 np0005474864 nova_compute[192593]: 2025-10-07 20:12:49.433 2 DEBUG nova.network.neutron [req-b3768faa-0270-4467-abc4-dca01a7736c0 req-033e7dda-5f10-41d6-bca2-b975caef6c92 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:12:49 np0005474864 nova_compute[192593]: 2025-10-07 20:12:49.452 2 DEBUG oslo_concurrency.lockutils [req-b3768faa-0270-4467-abc4-dca01a7736c0 req-033e7dda-5f10-41d6-bca2-b975caef6c92 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-1669315a-9455-4ddc-bddf-b5a535be9294" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:12:50 np0005474864 nova_compute[192593]: 2025-10-07 20:12:50.254 2 DEBUG nova.network.neutron [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Successfully updated port: be7697b8-3851-4db2-8ae0-bc42997f1332 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:12:50 np0005474864 nova_compute[192593]: 2025-10-07 20:12:50.268 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "refresh_cache-1669315a-9455-4ddc-bddf-b5a535be9294" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:12:50 np0005474864 nova_compute[192593]: 2025-10-07 20:12:50.269 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquired lock "refresh_cache-1669315a-9455-4ddc-bddf-b5a535be9294" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:12:50 np0005474864 nova_compute[192593]: 2025-10-07 20:12:50.269 2 DEBUG nova.network.neutron [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:12:50 np0005474864 nova_compute[192593]: 2025-10-07 20:12:50.490 2 DEBUG nova.network.neutron [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:12:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:50.998 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:51 np0005474864 nova_compute[192593]: 2025-10-07 20:12:51.349 2 DEBUG nova.compute.manager [req-7845be36-6205-4a32-b0a1-889cdf7c9f53 req-a5f0da7f-9785-4a83-8155-f7d29793967a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received event network-changed-be7697b8-3851-4db2-8ae0-bc42997f1332 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:12:51 np0005474864 nova_compute[192593]: 2025-10-07 20:12:51.350 2 DEBUG nova.compute.manager [req-7845be36-6205-4a32-b0a1-889cdf7c9f53 req-a5f0da7f-9785-4a83-8155-f7d29793967a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Refreshing instance network info cache due to event network-changed-be7697b8-3851-4db2-8ae0-bc42997f1332. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:12:51 np0005474864 nova_compute[192593]: 2025-10-07 20:12:51.351 2 DEBUG oslo_concurrency.lockutils [req-7845be36-6205-4a32-b0a1-889cdf7c9f53 req-a5f0da7f-9785-4a83-8155-f7d29793967a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-1669315a-9455-4ddc-bddf-b5a535be9294" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:12:52 np0005474864 nova_compute[192593]: 2025-10-07 20:12:52.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.447 2 DEBUG nova.network.neutron [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Updating instance_info_cache with network_info: [{"id": "7ce9ef63-687e-420f-b85d-071abf475fd7", "address": "fa:16:3e:8f:62:17", "network": {"id": "48bc8cb5-7112-4ac0-bcc2-12066714d0ea", "bridge": "br-int", "label": "tempest-network-smoke--2080782381", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ce9ef63-68", "ovs_interfaceid": "7ce9ef63-687e-420f-b85d-071abf475fd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "be7697b8-3851-4db2-8ae0-bc42997f1332", "address": "fa:16:3e:fc:38:3e", "network": {"id": "50d1db2f-7e6a-4b01-96dc-cd47acf22206", "bridge": "br-int", "label": "tempest-network-smoke--306467718", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefc:383e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe7697b8-38", "ovs_interfaceid": "be7697b8-3851-4db2-8ae0-bc42997f1332", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.474 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Releasing lock "refresh_cache-1669315a-9455-4ddc-bddf-b5a535be9294" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.475 2 DEBUG nova.compute.manager [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Instance network_info: |[{"id": "7ce9ef63-687e-420f-b85d-071abf475fd7", "address": "fa:16:3e:8f:62:17", "network": {"id": "48bc8cb5-7112-4ac0-bcc2-12066714d0ea", "bridge": "br-int", "label": "tempest-network-smoke--2080782381", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ce9ef63-68", "ovs_interfaceid": "7ce9ef63-687e-420f-b85d-071abf475fd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "be7697b8-3851-4db2-8ae0-bc42997f1332", "address": "fa:16:3e:fc:38:3e", "network": {"id": "50d1db2f-7e6a-4b01-96dc-cd47acf22206", "bridge": "br-int", "label": "tempest-network-smoke--306467718", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefc:383e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": 
null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe7697b8-38", "ovs_interfaceid": "be7697b8-3851-4db2-8ae0-bc42997f1332", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.475 2 DEBUG oslo_concurrency.lockutils [req-7845be36-6205-4a32-b0a1-889cdf7c9f53 req-a5f0da7f-9785-4a83-8155-f7d29793967a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-1669315a-9455-4ddc-bddf-b5a535be9294" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.476 2 DEBUG nova.network.neutron [req-7845be36-6205-4a32-b0a1-889cdf7c9f53 req-a5f0da7f-9785-4a83-8155-f7d29793967a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Refreshing network info cache for port be7697b8-3851-4db2-8ae0-bc42997f1332 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.484 2 DEBUG nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Start _get_guest_xml network_info=[{"id": "7ce9ef63-687e-420f-b85d-071abf475fd7", "address": "fa:16:3e:8f:62:17", "network": {"id": "48bc8cb5-7112-4ac0-bcc2-12066714d0ea", "bridge": "br-int", "label": "tempest-network-smoke--2080782381", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ce9ef63-68", "ovs_interfaceid": "7ce9ef63-687e-420f-b85d-071abf475fd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "be7697b8-3851-4db2-8ae0-bc42997f1332", "address": "fa:16:3e:fc:38:3e", "network": {"id": "50d1db2f-7e6a-4b01-96dc-cd47acf22206", "bridge": "br-int", "label": "tempest-network-smoke--306467718", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefc:383e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe7697b8-38", "ovs_interfaceid": "be7697b8-3851-4db2-8ae0-bc42997f1332", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.492 2 WARNING nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.499 2 DEBUG nova.virt.libvirt.host [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.500 2 DEBUG nova.virt.libvirt.host [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.508 2 DEBUG nova.virt.libvirt.host [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.509 2 DEBUG nova.virt.libvirt.host [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.511 2 DEBUG nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.511 2 DEBUG nova.virt.hardware [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T20:09:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3fec056a-1226-48ad-a02c-e4fe097a9363',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.512 2 DEBUG nova.virt.hardware [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.512 2 DEBUG nova.virt.hardware [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.513 2 DEBUG nova.virt.hardware [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.513 2 DEBUG nova.virt.hardware [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.513 2 DEBUG nova.virt.hardware [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.514 2 DEBUG nova.virt.hardware [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.514 2 DEBUG nova.virt.hardware [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.514 2 DEBUG nova.virt.hardware [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.515 2 DEBUG nova.virt.hardware [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.515 2 DEBUG nova.virt.hardware [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.519 2 DEBUG nova.virt.libvirt.vif [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:12:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-993173619',display_name='tempest-TestGettingAddress-server-993173619',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-993173619',id=16,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAMiPzYFMApCNDc9mIgm8Ln/jp3Xg1XJGHFUgqwN9wF6viQJ53hy2WYN1ZdeMyDZf3WAgFTiR2n+wfAYIJY6IB6Pdd0KHjGmouJJqkn9TClJJZ0hKIENBfl0N2NXC/VOKw==',key_name='tempest-TestGettingAddress-1769694523',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-mycfmhvq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:12:40Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=1669315a-9455-4ddc-bddf-b5a535be9294,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7ce9ef63-687e-420f-b85d-071abf475fd7", "address": "fa:16:3e:8f:62:17", "network": {"id": "48bc8cb5-7112-4ac0-bcc2-12066714d0ea", "bridge": "br-int", "label": "tempest-network-smoke--2080782381", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ce9ef63-68", "ovs_interfaceid": "7ce9ef63-687e-420f-b85d-071abf475fd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.519 2 DEBUG nova.network.os_vif_util [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "7ce9ef63-687e-420f-b85d-071abf475fd7", "address": "fa:16:3e:8f:62:17", "network": {"id": "48bc8cb5-7112-4ac0-bcc2-12066714d0ea", "bridge": "br-int", "label": "tempest-network-smoke--2080782381", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ce9ef63-68", "ovs_interfaceid": "7ce9ef63-687e-420f-b85d-071abf475fd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.520 2 DEBUG nova.network.os_vif_util [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:62:17,bridge_name='br-int',has_traffic_filtering=True,id=7ce9ef63-687e-420f-b85d-071abf475fd7,network=Network(48bc8cb5-7112-4ac0-bcc2-12066714d0ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ce9ef63-68') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.521 2 DEBUG nova.virt.libvirt.vif [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:12:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-993173619',display_name='tempest-TestGettingAddress-server-993173619',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-993173619',id=16,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAMiPzYFMApCNDc9mIgm8Ln/jp3Xg1XJGHFUgqwN9wF6viQJ53hy2WYN1ZdeMyDZf3WAgFTiR2n+wfAYIJY6IB6Pdd0KHjGmouJJqkn9TClJJZ0hKIENBfl0N2NXC/VOKw==',key_name='tempest-TestGettingAddress-1769694523',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-mycfmhvq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:12:40Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=1669315a-9455-4ddc-bddf-b5a535be9294,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be7697b8-3851-4db2-8ae0-bc42997f1332", "address": "fa:16:3e:fc:38:3e", "network": {"id": "50d1db2f-7e6a-4b01-96dc-cd47acf22206", "bridge": "br-int", "label": "tempest-network-smoke--306467718", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefc:383e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe7697b8-38", "ovs_interfaceid": "be7697b8-3851-4db2-8ae0-bc42997f1332", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.522 2 DEBUG nova.network.os_vif_util [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "be7697b8-3851-4db2-8ae0-bc42997f1332", "address": "fa:16:3e:fc:38:3e", "network": {"id": "50d1db2f-7e6a-4b01-96dc-cd47acf22206", "bridge": "br-int", "label": "tempest-network-smoke--306467718", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefc:383e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe7697b8-38", "ovs_interfaceid": "be7697b8-3851-4db2-8ae0-bc42997f1332", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.523 2 DEBUG nova.network.os_vif_util [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:38:3e,bridge_name='br-int',has_traffic_filtering=True,id=be7697b8-3851-4db2-8ae0-bc42997f1332,network=Network(50d1db2f-7e6a-4b01-96dc-cd47acf22206),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe7697b8-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.524 2 DEBUG nova.objects.instance [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 1669315a-9455-4ddc-bddf-b5a535be9294 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.546 2 DEBUG nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] End _get_guest_xml xml=<domain type="kvm">
Oct  7 16:12:53 np0005474864 nova_compute[192593]:  <uuid>1669315a-9455-4ddc-bddf-b5a535be9294</uuid>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:  <name>instance-00000010</name>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:  <memory>131072</memory>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:  <vcpu>1</vcpu>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <nova:name>tempest-TestGettingAddress-server-993173619</nova:name>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <nova:creationTime>2025-10-07 20:12:53</nova:creationTime>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <nova:flavor name="m1.nano">
Oct  7 16:12:53 np0005474864 nova_compute[192593]:        <nova:memory>128</nova:memory>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:        <nova:disk>1</nova:disk>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:        <nova:swap>0</nova:swap>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:        <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:        <nova:vcpus>1</nova:vcpus>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      </nova:flavor>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <nova:owner>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:        <nova:user uuid="334f092941fc46c496c7def76b2cfe18">tempest-TestGettingAddress-626136673-project-member</nova:user>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:        <nova:project uuid="2f9bf744045540618c9980fd4a7694f5">tempest-TestGettingAddress-626136673</nova:project>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      </nova:owner>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <nova:ports>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:        <nova:port uuid="7ce9ef63-687e-420f-b85d-071abf475fd7">
Oct  7 16:12:53 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:        <nova:port uuid="be7697b8-3851-4db2-8ae0-bc42997f1332">
Oct  7 16:12:53 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fefc:383e" ipVersion="6"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      </nova:ports>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    </nova:instance>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:  <sysinfo type="smbios">
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <entry name="manufacturer">RDO</entry>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <entry name="product">OpenStack Compute</entry>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <entry name="serial">1669315a-9455-4ddc-bddf-b5a535be9294</entry>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <entry name="uuid">1669315a-9455-4ddc-bddf-b5a535be9294</entry>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <entry name="family">Virtual Machine</entry>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <boot dev="hd"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <smbios mode="sysinfo"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <vmcoreinfo/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:  <clock offset="utc">
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <timer name="pit" tickpolicy="delay"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <timer name="hpet" present="no"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:  <cpu mode="custom" match="exact">
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <model>Nehalem</model>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <topology sockets="1" cores="1" threads="1"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <disk type="file" device="disk">
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/disk"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <target dev="vda" bus="virtio"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <disk type="file" device="cdrom">
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <driver name="qemu" type="raw" cache="none"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/disk.config"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <target dev="sda" bus="sata"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:8f:62:17"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <target dev="tap7ce9ef63-68"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:fc:38:3e"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <target dev="tapbe7697b8-38"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <serial type="pty">
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <log file="/var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/console.log" append="off"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <input type="tablet" bus="usb"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <rng model="virtio">
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <backend model="random">/dev/urandom</backend>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <controller type="usb" index="0"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    <memballoon model="virtio">
Oct  7 16:12:53 np0005474864 nova_compute[192593]:      <stats period="10"/>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:12:53 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:12:53 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:12:53 np0005474864 nova_compute[192593]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.549 2 DEBUG nova.compute.manager [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Preparing to wait for external event network-vif-plugged-7ce9ef63-687e-420f-b85d-071abf475fd7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.550 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.550 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.551 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.551 2 DEBUG nova.compute.manager [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Preparing to wait for external event network-vif-plugged-be7697b8-3851-4db2-8ae0-bc42997f1332 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.551 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.552 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.552 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.553 2 DEBUG nova.virt.libvirt.vif [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:12:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-993173619',display_name='tempest-TestGettingAddress-server-993173619',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-993173619',id=16,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAMiPzYFMApCNDc9mIgm8Ln/jp3Xg1XJGHFUgqwN9wF6viQJ53hy2WYN1ZdeMyDZf3WAgFTiR2n+wfAYIJY6IB6Pdd0KHjGmouJJqkn9TClJJZ0hKIENBfl0N2NXC/VOKw==',key_name='tempest-TestGettingAddress-1769694523',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-mycfmhvq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:12:40Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=1669315a-9455-4ddc-bddf-b5a535be9294,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7ce9ef63-687e-420f-b85d-071abf475fd7", "address": "fa:16:3e:8f:62:17", "network": {"id": "48bc8cb5-7112-4ac0-bcc2-12066714d0ea", "bridge": "br-int", "label": "tempest-network-smoke--2080782381", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ce9ef63-68", "ovs_interfaceid": "7ce9ef63-687e-420f-b85d-071abf475fd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.554 2 DEBUG nova.network.os_vif_util [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "7ce9ef63-687e-420f-b85d-071abf475fd7", "address": "fa:16:3e:8f:62:17", "network": {"id": "48bc8cb5-7112-4ac0-bcc2-12066714d0ea", "bridge": "br-int", "label": "tempest-network-smoke--2080782381", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ce9ef63-68", "ovs_interfaceid": "7ce9ef63-687e-420f-b85d-071abf475fd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.555 2 DEBUG nova.network.os_vif_util [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8f:62:17,bridge_name='br-int',has_traffic_filtering=True,id=7ce9ef63-687e-420f-b85d-071abf475fd7,network=Network(48bc8cb5-7112-4ac0-bcc2-12066714d0ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ce9ef63-68') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.556 2 DEBUG os_vif [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:62:17,bridge_name='br-int',has_traffic_filtering=True,id=7ce9ef63-687e-420f-b85d-071abf475fd7,network=Network(48bc8cb5-7112-4ac0-bcc2-12066714d0ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ce9ef63-68') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.557 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.558 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.558 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.562 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7ce9ef63-68, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.563 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7ce9ef63-68, col_values=(('external_ids', {'iface-id': '7ce9ef63-687e-420f-b85d-071abf475fd7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8f:62:17', 'vm-uuid': '1669315a-9455-4ddc-bddf-b5a535be9294'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.613 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:53 np0005474864 NetworkManager[51631]: <info>  [1759867973.6148] manager: (tap7ce9ef63-68): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.621 2 INFO os_vif [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8f:62:17,bridge_name='br-int',has_traffic_filtering=True,id=7ce9ef63-687e-420f-b85d-071abf475fd7,network=Network(48bc8cb5-7112-4ac0-bcc2-12066714d0ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ce9ef63-68')#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.622 2 DEBUG nova.virt.libvirt.vif [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:12:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-993173619',display_name='tempest-TestGettingAddress-server-993173619',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-993173619',id=16,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAMiPzYFMApCNDc9mIgm8Ln/jp3Xg1XJGHFUgqwN9wF6viQJ53hy2WYN1ZdeMyDZf3WAgFTiR2n+wfAYIJY6IB6Pdd0KHjGmouJJqkn9TClJJZ0hKIENBfl0N2NXC/VOKw==',key_name='tempest-TestGettingAddress-1769694523',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-mycfmhvq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:12:40Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=1669315a-9455-4ddc-bddf-b5a535be9294,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "be7697b8-3851-4db2-8ae0-bc42997f1332", "address": "fa:16:3e:fc:38:3e", "network": {"id": "50d1db2f-7e6a-4b01-96dc-cd47acf22206", "bridge": "br-int", "label": "tempest-network-smoke--306467718", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefc:383e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe7697b8-38", "ovs_interfaceid": "be7697b8-3851-4db2-8ae0-bc42997f1332", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.622 2 DEBUG nova.network.os_vif_util [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "be7697b8-3851-4db2-8ae0-bc42997f1332", "address": "fa:16:3e:fc:38:3e", "network": {"id": "50d1db2f-7e6a-4b01-96dc-cd47acf22206", "bridge": "br-int", "label": "tempest-network-smoke--306467718", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefc:383e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe7697b8-38", "ovs_interfaceid": "be7697b8-3851-4db2-8ae0-bc42997f1332", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.623 2 DEBUG nova.network.os_vif_util [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:38:3e,bridge_name='br-int',has_traffic_filtering=True,id=be7697b8-3851-4db2-8ae0-bc42997f1332,network=Network(50d1db2f-7e6a-4b01-96dc-cd47acf22206),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe7697b8-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.623 2 DEBUG os_vif [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:38:3e,bridge_name='br-int',has_traffic_filtering=True,id=be7697b8-3851-4db2-8ae0-bc42997f1332,network=Network(50d1db2f-7e6a-4b01-96dc-cd47acf22206),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe7697b8-38') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.623 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.624 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.624 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.626 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.627 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapbe7697b8-38, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.627 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapbe7697b8-38, col_values=(('external_ids', {'iface-id': 'be7697b8-3851-4db2-8ae0-bc42997f1332', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fc:38:3e', 'vm-uuid': '1669315a-9455-4ddc-bddf-b5a535be9294'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.628 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:53 np0005474864 NetworkManager[51631]: <info>  [1759867973.6291] manager: (tapbe7697b8-38): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.639 2 INFO os_vif [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:38:3e,bridge_name='br-int',has_traffic_filtering=True,id=be7697b8-3851-4db2-8ae0-bc42997f1332,network=Network(50d1db2f-7e6a-4b01-96dc-cd47acf22206),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe7697b8-38')#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.714 2 DEBUG nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.714 2 DEBUG nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.714 2 DEBUG nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No VIF found with MAC fa:16:3e:8f:62:17, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.714 2 DEBUG nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No VIF found with MAC fa:16:3e:fc:38:3e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:12:53 np0005474864 nova_compute[192593]: 2025-10-07 20:12:53.715 2 INFO nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Using config drive#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.548 2 INFO nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Creating config drive at /var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/disk.config#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.553 2 DEBUG oslo_concurrency.processutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq_73wte_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.684 2 DEBUG oslo_concurrency.processutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpq_73wte_" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.723 2 DEBUG nova.compute.manager [req-8daa0192-ab26-4321-a902-382d7d64d6a7 req-065ac210-45f9-4f9b-8054-7fc72616f723 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received event network-vif-plugged-63b103d2-ef83-41aa-9080-3137adafe387 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.725 2 DEBUG oslo_concurrency.lockutils [req-8daa0192-ab26-4321-a902-382d7d64d6a7 req-065ac210-45f9-4f9b-8054-7fc72616f723 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.725 2 DEBUG oslo_concurrency.lockutils [req-8daa0192-ab26-4321-a902-382d7d64d6a7 req-065ac210-45f9-4f9b-8054-7fc72616f723 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.726 2 DEBUG oslo_concurrency.lockutils [req-8daa0192-ab26-4321-a902-382d7d64d6a7 req-065ac210-45f9-4f9b-8054-7fc72616f723 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.726 2 DEBUG nova.compute.manager [req-8daa0192-ab26-4321-a902-382d7d64d6a7 req-065ac210-45f9-4f9b-8054-7fc72616f723 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Processing event network-vif-plugged-63b103d2-ef83-41aa-9080-3137adafe387 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.727 2 DEBUG nova.compute.manager [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Instance event wait completed in 9 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.732 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759867974.731935, f2a4ae00-d828-4178-880f-cc034629d96e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.733 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.735 2 DEBUG nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.745 2 INFO nova.virt.libvirt.driver [-] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Instance spawned successfully.#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.746 2 DEBUG nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  7 16:12:54 np0005474864 NetworkManager[51631]: <info>  [1759867974.7555] manager: (tap7ce9ef63-68): new Tun device (/org/freedesktop/NetworkManager/Devices/50)
Oct  7 16:12:54 np0005474864 kernel: tap7ce9ef63-68: entered promiscuous mode
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.763 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:12:54 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:54Z|00075|binding|INFO|Claiming lport 7ce9ef63-687e-420f-b85d-071abf475fd7 for this chassis.
Oct  7 16:12:54 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:54Z|00076|binding|INFO|7ce9ef63-687e-420f-b85d-071abf475fd7: Claiming fa:16:3e:8f:62:17 10.100.0.5
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.764 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.774 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.779 2 DEBUG nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.779 2 DEBUG nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:12:54 np0005474864 NetworkManager[51631]: <info>  [1759867974.7806] manager: (tapbe7697b8-38): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.780 2 DEBUG nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.780 2 DEBUG nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.780 2 DEBUG nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.781 2 DEBUG nova.virt.libvirt.driver [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:12:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:54.800 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:62:17 10.100.0.5'], port_security=['fa:16:3e:8f:62:17 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '1669315a-9455-4ddc-bddf-b5a535be9294', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48bc8cb5-7112-4ac0-bcc2-12066714d0ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a513f697-18f2-4f8c-b79e-8feb80b81d11', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03b9f824-0816-49ca-b067-98a9e0b122ad, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=7ce9ef63-687e-420f-b85d-071abf475fd7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:12:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:54.801 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 7ce9ef63-687e-420f-b85d-071abf475fd7 in datapath 48bc8cb5-7112-4ac0-bcc2-12066714d0ea bound to our chassis#033[00m
Oct  7 16:12:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:54.804 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 48bc8cb5-7112-4ac0-bcc2-12066714d0ea#033[00m
Oct  7 16:12:54 np0005474864 systemd-udevd[222400]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:12:54 np0005474864 systemd-udevd[222401]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.824 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:12:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:54.822 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[878f4a2b-cb31-49cc-9c3d-16a1e51fcc15]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:54 np0005474864 NetworkManager[51631]: <info>  [1759867974.8257] device (tap7ce9ef63-68): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:12:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:54.823 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap48bc8cb5-71 in ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:12:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:54.826 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap48bc8cb5-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:12:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:54.826 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[6d856a44-4a06-4732-8ba7-fa93eea82e31]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:54 np0005474864 NetworkManager[51631]: <info>  [1759867974.8276] device (tap7ce9ef63-68): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:12:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:54.827 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[4e65d780-c80d-4762-8cea-849df4587129]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:54.840 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[8f2711ef-3c41-4028-b721-82c23586c417]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:54 np0005474864 kernel: tapbe7697b8-38: entered promiscuous mode
Oct  7 16:12:54 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:54Z|00077|binding|INFO|Claiming lport be7697b8-3851-4db2-8ae0-bc42997f1332 for this chassis.
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.847 2 INFO nova.compute.manager [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Took 19.69 seconds to spawn the instance on the hypervisor.#033[00m
Oct  7 16:12:54 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:54Z|00078|binding|INFO|be7697b8-3851-4db2-8ae0-bc42997f1332: Claiming fa:16:3e:fc:38:3e 2001:db8::f816:3eff:fefc:383e
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.848 2 DEBUG nova.compute.manager [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:54 np0005474864 NetworkManager[51631]: <info>  [1759867974.8505] device (tapbe7697b8-38): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:12:54 np0005474864 NetworkManager[51631]: <info>  [1759867974.8517] device (tapbe7697b8-38): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:12:54 np0005474864 systemd-machined[152586]: New machine qemu-6-instance-00000010.
Oct  7 16:12:54 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:54Z|00079|binding|INFO|Setting lport 7ce9ef63-687e-420f-b85d-071abf475fd7 ovn-installed in OVS
Oct  7 16:12:54 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:54Z|00080|binding|INFO|Setting lport 7ce9ef63-687e-420f-b85d-071abf475fd7 up in Southbound
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.855 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:54.855 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:38:3e 2001:db8::f816:3eff:fefc:383e'], port_security=['fa:16:3e:fc:38:3e 2001:db8::f816:3eff:fefc:383e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fefc:383e/64', 'neutron:device_id': '1669315a-9455-4ddc-bddf-b5a535be9294', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50d1db2f-7e6a-4b01-96dc-cd47acf22206', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a513f697-18f2-4f8c-b79e-8feb80b81d11', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d381511a-5493-4cd7-9663-2f55bb48bf91, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=be7697b8-3851-4db2-8ae0-bc42997f1332) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:12:54 np0005474864 systemd[1]: Started Virtual Machine qemu-6-instance-00000010.
Oct  7 16:12:54 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:54Z|00081|binding|INFO|Setting lport be7697b8-3851-4db2-8ae0-bc42997f1332 ovn-installed in OVS
Oct  7 16:12:54 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:54Z|00082|binding|INFO|Setting lport be7697b8-3851-4db2-8ae0-bc42997f1332 up in Southbound
Oct  7 16:12:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:54.867 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[be2f714d-85fc-48ea-8ee5-547d94049644]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.867 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:54.901 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[c9a31a85-e347-4873-8040-c061b7611711]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:54.907 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[b3b428e6-0967-496a-b747-c9b0ab1fffd6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:54 np0005474864 NetworkManager[51631]: <info>  [1759867974.9086] manager: (tap48bc8cb5-70): new Veth device (/org/freedesktop/NetworkManager/Devices/52)
Oct  7 16:12:54 np0005474864 systemd-udevd[222405]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.951 2 INFO nova.compute.manager [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Took 20.23 seconds to build instance.#033[00m
Oct  7 16:12:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:54.951 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[41b108c0-363a-47d1-9e1f-981039b07b87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:54.956 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[bf29d441-39ed-4bf3-88f8-fd57e9e72d87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:54 np0005474864 nova_compute[192593]: 2025-10-07 20:12:54.967 2 DEBUG oslo_concurrency.lockutils [None req-46e8b062-a93c-4af3-8358-563cdb6162fa fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 20.309s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:54 np0005474864 NetworkManager[51631]: <info>  [1759867974.9880] device (tap48bc8cb5-70): carrier: link connected
Oct  7 16:12:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:54.994 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[25cfedea-644e-43a1-82d8-1f89df706c46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.014 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[6afe1088-4522-41b6-864d-d8643324da0b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48bc8cb5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a1:de:e4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 361088, 'reachable_time': 43970, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222435, 'error': None, 'target': 'ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.042 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c39ee6e7-6a4d-45d0-ba66-b99b51f6d4a7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea1:dee4'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 361088, 'tstamp': 361088}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222436, 'error': None, 'target': 'ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.060 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[2c0be9fe-777f-4f9b-93e9-7038e57043e8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap48bc8cb5-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a1:de:e4'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 361088, 'reachable_time': 43970, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222437, 'error': None, 'target': 'ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.096 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[0a04e621-01ee-45da-b55b-967f1e73faaa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.162 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[62fa6aed-a782-4da9-819a-a89e9fc26ab3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.163 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48bc8cb5-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.164 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.164 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap48bc8cb5-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:55 np0005474864 kernel: tap48bc8cb5-70: entered promiscuous mode
Oct  7 16:12:55 np0005474864 NetworkManager[51631]: <info>  [1759867975.1668] manager: (tap48bc8cb5-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/53)
Oct  7 16:12:55 np0005474864 nova_compute[192593]: 2025-10-07 20:12:55.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.169 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap48bc8cb5-70, col_values=(('external_ids', {'iface-id': '8debe99b-81af-459e-8217-4b06af1ff98e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:55 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:55Z|00083|binding|INFO|Releasing lport 8debe99b-81af-459e-8217-4b06af1ff98e from this chassis (sb_readonly=0)
Oct  7 16:12:55 np0005474864 nova_compute[192593]: 2025-10-07 20:12:55.187 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.188 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/48bc8cb5-7112-4ac0-bcc2-12066714d0ea.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/48bc8cb5-7112-4ac0-bcc2-12066714d0ea.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.189 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[242b1313-d39c-4642-90c2-00501feaad2f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.190 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-48bc8cb5-7112-4ac0-bcc2-12066714d0ea
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/48bc8cb5-7112-4ac0-bcc2-12066714d0ea.pid.haproxy
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID 48bc8cb5-7112-4ac0-bcc2-12066714d0ea
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.190 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea', 'env', 'PROCESS_TAG=haproxy-48bc8cb5-7112-4ac0-bcc2-12066714d0ea', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/48bc8cb5-7112-4ac0-bcc2-12066714d0ea.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:12:55 np0005474864 podman[222470]: 2025-10-07 20:12:55.641097123 +0000 UTC m=+0.085250803 container create f48a291815918ed8fb996cb95768bea1138ea5ebdadf13e6f19777f5f874a35d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  7 16:12:55 np0005474864 podman[222470]: 2025-10-07 20:12:55.586161613 +0000 UTC m=+0.030315303 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:12:55 np0005474864 systemd[1]: Started libpod-conmon-f48a291815918ed8fb996cb95768bea1138ea5ebdadf13e6f19777f5f874a35d.scope.
Oct  7 16:12:55 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:12:55 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e90366fa718a2598047dcea30085e935f589e25a16fbf914eb569e1889787251/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:12:55 np0005474864 podman[222470]: 2025-10-07 20:12:55.749029121 +0000 UTC m=+0.193182791 container init f48a291815918ed8fb996cb95768bea1138ea5ebdadf13e6f19777f5f874a35d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  7 16:12:55 np0005474864 podman[222470]: 2025-10-07 20:12:55.75528288 +0000 UTC m=+0.199436530 container start f48a291815918ed8fb996cb95768bea1138ea5ebdadf13e6f19777f5f874a35d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:12:55 np0005474864 neutron-haproxy-ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea[222486]: [NOTICE]   (222490) : New worker (222499) forked
Oct  7 16:12:55 np0005474864 neutron-haproxy-ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea[222486]: [NOTICE]   (222490) : Loading success.
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.827 103685 INFO neutron.agent.ovn.metadata.agent [-] Port be7697b8-3851-4db2-8ae0-bc42997f1332 in datapath 50d1db2f-7e6a-4b01-96dc-cd47acf22206 unbound from our chassis#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.830 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 50d1db2f-7e6a-4b01-96dc-cd47acf22206#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.842 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c2dfca2b-7e22-4e37-9218-d11b99fa9de1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.843 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap50d1db2f-71 in ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.845 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap50d1db2f-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.845 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[90a00c09-4569-42d6-a7ae-bee164e1abe5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.846 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[1b5b9785-877d-4c5d-89a0-0babe5716ce4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.863 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[bbac238a-87d6-448a-8f11-aa2e2e955be8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:55 np0005474864 nova_compute[192593]: 2025-10-07 20:12:55.866 2 DEBUG nova.network.neutron [req-7845be36-6205-4a32-b0a1-889cdf7c9f53 req-a5f0da7f-9785-4a83-8155-f7d29793967a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Updated VIF entry in instance network info cache for port be7697b8-3851-4db2-8ae0-bc42997f1332. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:12:55 np0005474864 nova_compute[192593]: 2025-10-07 20:12:55.867 2 DEBUG nova.network.neutron [req-7845be36-6205-4a32-b0a1-889cdf7c9f53 req-a5f0da7f-9785-4a83-8155-f7d29793967a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Updating instance_info_cache with network_info: [{"id": "7ce9ef63-687e-420f-b85d-071abf475fd7", "address": "fa:16:3e:8f:62:17", "network": {"id": "48bc8cb5-7112-4ac0-bcc2-12066714d0ea", "bridge": "br-int", "label": "tempest-network-smoke--2080782381", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ce9ef63-68", "ovs_interfaceid": "7ce9ef63-687e-420f-b85d-071abf475fd7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "be7697b8-3851-4db2-8ae0-bc42997f1332", "address": "fa:16:3e:fc:38:3e", "network": {"id": "50d1db2f-7e6a-4b01-96dc-cd47acf22206", "bridge": "br-int", "label": "tempest-network-smoke--306467718", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefc:383e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe7697b8-38", "ovs_interfaceid": "be7697b8-3851-4db2-8ae0-bc42997f1332", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.885 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[7a94be27-4efd-46f0-a058-a475d39fe8ad]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:55 np0005474864 nova_compute[192593]: 2025-10-07 20:12:55.889 2 DEBUG oslo_concurrency.lockutils [req-7845be36-6205-4a32-b0a1-889cdf7c9f53 req-a5f0da7f-9785-4a83-8155-f7d29793967a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-1669315a-9455-4ddc-bddf-b5a535be9294" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.921 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[b36aed19-3098-46e1-9af4-b49747011adb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:55 np0005474864 NetworkManager[51631]: <info>  [1759867975.9309] manager: (tap50d1db2f-70): new Veth device (/org/freedesktop/NetworkManager/Devices/54)
Oct  7 16:12:55 np0005474864 systemd-udevd[222421]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.931 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[9eba642d-8448-44ad-b476-0f11112240da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.973 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[b98b6b36-f296-40a0-b1f4-bca6ed73feea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:55.977 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[6f93e74e-34d2-4e02-aef8-edfa226f6e15]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:56 np0005474864 NetworkManager[51631]: <info>  [1759867976.0141] device (tap50d1db2f-70): carrier: link connected
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:56.022 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[02da0cb9-ec5c-4fc3-be1f-c292ec951bbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:56.045 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[cbe4c26a-a0b8-4862-8a92-88a53bbcb3df]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap50d1db2f-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:49:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 361190, 'reachable_time': 33182, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222519, 'error': None, 'target': 'ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:56.073 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ccf0bf9b-cb09-4c25-9b52-3fe0beba8fb2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe55:492a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 361190, 'tstamp': 361190}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222520, 'error': None, 'target': 'ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:56.094 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[19fe4e5e-e99f-434c-b2da-c42cb227fbc9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap50d1db2f-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:55:49:2a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 30], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 361190, 'reachable_time': 33182, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222521, 'error': None, 'target': 'ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:56.141 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d85cc2f9-691a-4464-b822-3f3570c96a1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:56.187 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[f19317b7-0a19-49e1-b075-de5691b9fd07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:56.193 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50d1db2f-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:56.194 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:56.195 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50d1db2f-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:56 np0005474864 nova_compute[192593]: 2025-10-07 20:12:56.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:56 np0005474864 kernel: tap50d1db2f-70: entered promiscuous mode
Oct  7 16:12:56 np0005474864 NetworkManager[51631]: <info>  [1759867976.1995] manager: (tap50d1db2f-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/55)
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:56.205 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap50d1db2f-70, col_values=(('external_ids', {'iface-id': '35331a4a-db8c-4977-9b95-771260e3e40b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:12:56 np0005474864 nova_compute[192593]: 2025-10-07 20:12:56.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:56 np0005474864 ovn_controller[94801]: 2025-10-07T20:12:56Z|00084|binding|INFO|Releasing lport 35331a4a-db8c-4977-9b95-771260e3e40b from this chassis (sb_readonly=0)
Oct  7 16:12:56 np0005474864 nova_compute[192593]: 2025-10-07 20:12:56.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:56.210 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/50d1db2f-7e6a-4b01-96dc-cd47acf22206.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/50d1db2f-7e6a-4b01-96dc-cd47acf22206.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:56.211 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[34865635-06d7-4fe3-998c-3a53c8cde87d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:56.213 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-50d1db2f-7e6a-4b01-96dc-cd47acf22206
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/50d1db2f-7e6a-4b01-96dc-cd47acf22206.pid.haproxy
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID 50d1db2f-7e6a-4b01-96dc-cd47acf22206
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:12:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:12:56.214 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206', 'env', 'PROCESS_TAG=haproxy-50d1db2f-7e6a-4b01-96dc-cd47acf22206', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/50d1db2f-7e6a-4b01-96dc-cd47acf22206.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:12:56 np0005474864 nova_compute[192593]: 2025-10-07 20:12:56.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:56 np0005474864 nova_compute[192593]: 2025-10-07 20:12:56.380 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759867976.380209, 1669315a-9455-4ddc-bddf-b5a535be9294 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:12:56 np0005474864 nova_compute[192593]: 2025-10-07 20:12:56.381 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] VM Started (Lifecycle Event)#033[00m
Oct  7 16:12:56 np0005474864 nova_compute[192593]: 2025-10-07 20:12:56.409 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:12:56 np0005474864 nova_compute[192593]: 2025-10-07 20:12:56.412 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759867976.3804626, 1669315a-9455-4ddc-bddf-b5a535be9294 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:12:56 np0005474864 nova_compute[192593]: 2025-10-07 20:12:56.413 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:12:56 np0005474864 nova_compute[192593]: 2025-10-07 20:12:56.435 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:12:56 np0005474864 nova_compute[192593]: 2025-10-07 20:12:56.438 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:12:56 np0005474864 nova_compute[192593]: 2025-10-07 20:12:56.461 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:12:56 np0005474864 podman[222552]: 2025-10-07 20:12:56.615159821 +0000 UTC m=+0.035629493 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:12:56 np0005474864 podman[222552]: 2025-10-07 20:12:56.731850719 +0000 UTC m=+0.152320341 container create 2b05ce08965264a963694492d1b2ecdecd73c1cdcb09823e966c921db459ea57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:12:56 np0005474864 systemd[1]: Started libpod-conmon-2b05ce08965264a963694492d1b2ecdecd73c1cdcb09823e966c921db459ea57.scope.
Oct  7 16:12:56 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:12:56 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3fa58bd42a183b1a0281e2e1f4efd13479c2bebcf9fd05a792c046c1422813f9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:12:56 np0005474864 podman[222552]: 2025-10-07 20:12:56.844595334 +0000 UTC m=+0.265064976 container init 2b05ce08965264a963694492d1b2ecdecd73c1cdcb09823e966c921db459ea57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  7 16:12:56 np0005474864 podman[222552]: 2025-10-07 20:12:56.854996163 +0000 UTC m=+0.275465775 container start 2b05ce08965264a963694492d1b2ecdecd73c1cdcb09823e966c921db459ea57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  7 16:12:56 np0005474864 neutron-haproxy-ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206[222574]: [NOTICE]   (222600) : New worker (222616) forked
Oct  7 16:12:56 np0005474864 neutron-haproxy-ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206[222574]: [NOTICE]   (222600) : Loading success.
Oct  7 16:12:56 np0005474864 podman[222565]: 2025-10-07 20:12:56.887495585 +0000 UTC m=+0.100111923 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:12:56 np0005474864 podman[222568]: 2025-10-07 20:12:56.892149509 +0000 UTC m=+0.092168706 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.component=ubi9-minimal-container, vcs-type=git, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, version=9.6)
Oct  7 16:12:57 np0005474864 nova_compute[192593]: 2025-10-07 20:12:57.453 2 DEBUG nova.compute.manager [req-2676e945-72d7-40c5-9a56-b5de017ff083 req-990ca180-133e-4d65-8784-af359e578253 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received event network-vif-plugged-63b103d2-ef83-41aa-9080-3137adafe387 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:12:57 np0005474864 nova_compute[192593]: 2025-10-07 20:12:57.454 2 DEBUG oslo_concurrency.lockutils [req-2676e945-72d7-40c5-9a56-b5de017ff083 req-990ca180-133e-4d65-8784-af359e578253 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:57 np0005474864 nova_compute[192593]: 2025-10-07 20:12:57.454 2 DEBUG oslo_concurrency.lockutils [req-2676e945-72d7-40c5-9a56-b5de017ff083 req-990ca180-133e-4d65-8784-af359e578253 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:57 np0005474864 nova_compute[192593]: 2025-10-07 20:12:57.455 2 DEBUG oslo_concurrency.lockutils [req-2676e945-72d7-40c5-9a56-b5de017ff083 req-990ca180-133e-4d65-8784-af359e578253 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:57 np0005474864 nova_compute[192593]: 2025-10-07 20:12:57.455 2 DEBUG nova.compute.manager [req-2676e945-72d7-40c5-9a56-b5de017ff083 req-990ca180-133e-4d65-8784-af359e578253 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] No waiting events found dispatching network-vif-plugged-63b103d2-ef83-41aa-9080-3137adafe387 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:12:57 np0005474864 nova_compute[192593]: 2025-10-07 20:12:57.455 2 WARNING nova.compute.manager [req-2676e945-72d7-40c5-9a56-b5de017ff083 req-990ca180-133e-4d65-8784-af359e578253 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received unexpected event network-vif-plugged-63b103d2-ef83-41aa-9080-3137adafe387 for instance with vm_state active and task_state None.#033[00m
Oct  7 16:12:57 np0005474864 nova_compute[192593]: 2025-10-07 20:12:57.648 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:58 np0005474864 nova_compute[192593]: 2025-10-07 20:12:58.629 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.808 2 DEBUG nova.compute.manager [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received event network-vif-plugged-7ce9ef63-687e-420f-b85d-071abf475fd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.808 2 DEBUG oslo_concurrency.lockutils [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.809 2 DEBUG oslo_concurrency.lockutils [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.809 2 DEBUG oslo_concurrency.lockutils [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.809 2 DEBUG nova.compute.manager [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Processing event network-vif-plugged-7ce9ef63-687e-420f-b85d-071abf475fd7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.809 2 DEBUG nova.compute.manager [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received event network-vif-plugged-7ce9ef63-687e-420f-b85d-071abf475fd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.809 2 DEBUG oslo_concurrency.lockutils [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.810 2 DEBUG oslo_concurrency.lockutils [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.810 2 DEBUG oslo_concurrency.lockutils [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.810 2 DEBUG nova.compute.manager [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] No event matching network-vif-plugged-7ce9ef63-687e-420f-b85d-071abf475fd7 in dict_keys([('network-vif-plugged', 'be7697b8-3851-4db2-8ae0-bc42997f1332')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.810 2 WARNING nova.compute.manager [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received unexpected event network-vif-plugged-7ce9ef63-687e-420f-b85d-071abf475fd7 for instance with vm_state building and task_state spawning.#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.810 2 DEBUG nova.compute.manager [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received event network-vif-plugged-be7697b8-3851-4db2-8ae0-bc42997f1332 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.811 2 DEBUG oslo_concurrency.lockutils [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.811 2 DEBUG oslo_concurrency.lockutils [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.811 2 DEBUG oslo_concurrency.lockutils [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.811 2 DEBUG nova.compute.manager [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Processing event network-vif-plugged-be7697b8-3851-4db2-8ae0-bc42997f1332 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.811 2 DEBUG nova.compute.manager [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received event network-vif-plugged-be7697b8-3851-4db2-8ae0-bc42997f1332 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.812 2 DEBUG oslo_concurrency.lockutils [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.812 2 DEBUG oslo_concurrency.lockutils [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.812 2 DEBUG oslo_concurrency.lockutils [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.812 2 DEBUG nova.compute.manager [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] No waiting events found dispatching network-vif-plugged-be7697b8-3851-4db2-8ae0-bc42997f1332 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.812 2 WARNING nova.compute.manager [req-0b5ec568-6fe3-46f5-9314-6264cd6b8106 req-9e629ff8-6df7-4a18-b524-f9cb4283fe84 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received unexpected event network-vif-plugged-be7697b8-3851-4db2-8ae0-bc42997f1332 for instance with vm_state building and task_state spawning.#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.813 2 DEBUG nova.compute.manager [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Instance event wait completed in 3 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.816 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759867979.816806, 1669315a-9455-4ddc-bddf-b5a535be9294 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.817 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.819 2 DEBUG nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.822 2 INFO nova.virt.libvirt.driver [-] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Instance spawned successfully.#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.822 2 DEBUG nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.838 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.845 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.849 2 DEBUG nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.849 2 DEBUG nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.850 2 DEBUG nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.850 2 DEBUG nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.851 2 DEBUG nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.851 2 DEBUG nova.virt.libvirt.driver [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.879 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.907 2 INFO nova.compute.manager [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Took 19.00 seconds to spawn the instance on the hypervisor.#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.908 2 DEBUG nova.compute.manager [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:12:59 np0005474864 nova_compute[192593]: 2025-10-07 20:12:59.995 2 INFO nova.compute.manager [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Took 19.57 seconds to build instance.#033[00m
Oct  7 16:13:00 np0005474864 nova_compute[192593]: 2025-10-07 20:13:00.018 2 DEBUG oslo_concurrency.lockutils [None req-a4577f51-c579-4bb3-8e5d-3d928b531642 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.682s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:13:02 np0005474864 nova_compute[192593]: 2025-10-07 20:13:02.066 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:02 np0005474864 NetworkManager[51631]: <info>  [1759867982.0672] manager: (patch-provnet-53a6f422-cce6-4082-98aa-3619989346bc-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Oct  7 16:13:02 np0005474864 NetworkManager[51631]: <info>  [1759867982.0682] manager: (patch-br-int-to-provnet-53a6f422-cce6-4082-98aa-3619989346bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Oct  7 16:13:02 np0005474864 nova_compute[192593]: 2025-10-07 20:13:02.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:02 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:02Z|00085|binding|INFO|Releasing lport 8debe99b-81af-459e-8217-4b06af1ff98e from this chassis (sb_readonly=0)
Oct  7 16:13:02 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:02Z|00086|binding|INFO|Releasing lport 35331a4a-db8c-4977-9b95-771260e3e40b from this chassis (sb_readonly=0)
Oct  7 16:13:02 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:02Z|00087|binding|INFO|Releasing lport f34158c9-a766-4691-8248-3424f7b7ca88 from this chassis (sb_readonly=0)
Oct  7 16:13:02 np0005474864 nova_compute[192593]: 2025-10-07 20:13:02.352 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:02 np0005474864 podman[222629]: 2025-10-07 20:13:02.392448807 +0000 UTC m=+0.073429577 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:13:02 np0005474864 podman[222627]: 2025-10-07 20:13:02.412468862 +0000 UTC m=+0.081381966 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=iscsid)
Oct  7 16:13:02 np0005474864 podman[222628]: 2025-10-07 20:13:02.416910289 +0000 UTC m=+0.110874752 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  7 16:13:02 np0005474864 nova_compute[192593]: 2025-10-07 20:13:02.698 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:03 np0005474864 nova_compute[192593]: 2025-10-07 20:13:03.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:04 np0005474864 nova_compute[192593]: 2025-10-07 20:13:04.456 2 DEBUG nova.compute.manager [req-bab74018-de09-4c4b-a9f8-53850ff3d3f3 req-51fa3150-83ad-4133-9916-42715c84b1e7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received event network-changed-63b103d2-ef83-41aa-9080-3137adafe387 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:13:04 np0005474864 nova_compute[192593]: 2025-10-07 20:13:04.457 2 DEBUG nova.compute.manager [req-bab74018-de09-4c4b-a9f8-53850ff3d3f3 req-51fa3150-83ad-4133-9916-42715c84b1e7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Refreshing instance network info cache due to event network-changed-63b103d2-ef83-41aa-9080-3137adafe387. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:13:04 np0005474864 nova_compute[192593]: 2025-10-07 20:13:04.457 2 DEBUG oslo_concurrency.lockutils [req-bab74018-de09-4c4b-a9f8-53850ff3d3f3 req-51fa3150-83ad-4133-9916-42715c84b1e7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:13:04 np0005474864 nova_compute[192593]: 2025-10-07 20:13:04.458 2 DEBUG oslo_concurrency.lockutils [req-bab74018-de09-4c4b-a9f8-53850ff3d3f3 req-51fa3150-83ad-4133-9916-42715c84b1e7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:13:04 np0005474864 nova_compute[192593]: 2025-10-07 20:13:04.458 2 DEBUG nova.network.neutron [req-bab74018-de09-4c4b-a9f8-53850ff3d3f3 req-51fa3150-83ad-4133-9916-42715c84b1e7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Refreshing network info cache for port 63b103d2-ef83-41aa-9080-3137adafe387 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:13:06 np0005474864 nova_compute[192593]: 2025-10-07 20:13:06.091 2 DEBUG nova.network.neutron [req-bab74018-de09-4c4b-a9f8-53850ff3d3f3 req-51fa3150-83ad-4133-9916-42715c84b1e7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Updated VIF entry in instance network info cache for port 63b103d2-ef83-41aa-9080-3137adafe387. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:13:06 np0005474864 nova_compute[192593]: 2025-10-07 20:13:06.091 2 DEBUG nova.network.neutron [req-bab74018-de09-4c4b-a9f8-53850ff3d3f3 req-51fa3150-83ad-4133-9916-42715c84b1e7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Updating instance_info_cache with network_info: [{"id": "63b103d2-ef83-41aa-9080-3137adafe387", "address": "fa:16:3e:f0:7b:b2", "network": {"id": "b153978d-a2d5-4c7d-8ff5-8249927e8e0f", "bridge": "br-int", "label": "tempest-network-smoke--1424478439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63b103d2-ef", "ovs_interfaceid": "63b103d2-ef83-41aa-9080-3137adafe387", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:13:06 np0005474864 nova_compute[192593]: 2025-10-07 20:13:06.111 2 DEBUG oslo_concurrency.lockutils [req-bab74018-de09-4c4b-a9f8-53850ff3d3f3 req-51fa3150-83ad-4133-9916-42715c84b1e7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:13:06 np0005474864 nova_compute[192593]: 2025-10-07 20:13:06.550 2 DEBUG nova.compute.manager [req-c36712f3-0c88-4498-8c9c-393c8f9ad024 req-a142f466-a025-4479-9aaf-b4a9ba1e2ea3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received event network-changed-7ce9ef63-687e-420f-b85d-071abf475fd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:13:06 np0005474864 nova_compute[192593]: 2025-10-07 20:13:06.551 2 DEBUG nova.compute.manager [req-c36712f3-0c88-4498-8c9c-393c8f9ad024 req-a142f466-a025-4479-9aaf-b4a9ba1e2ea3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Refreshing instance network info cache due to event network-changed-7ce9ef63-687e-420f-b85d-071abf475fd7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:13:06 np0005474864 nova_compute[192593]: 2025-10-07 20:13:06.551 2 DEBUG oslo_concurrency.lockutils [req-c36712f3-0c88-4498-8c9c-393c8f9ad024 req-a142f466-a025-4479-9aaf-b4a9ba1e2ea3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-1669315a-9455-4ddc-bddf-b5a535be9294" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:13:06 np0005474864 nova_compute[192593]: 2025-10-07 20:13:06.551 2 DEBUG oslo_concurrency.lockutils [req-c36712f3-0c88-4498-8c9c-393c8f9ad024 req-a142f466-a025-4479-9aaf-b4a9ba1e2ea3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-1669315a-9455-4ddc-bddf-b5a535be9294" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:13:06 np0005474864 nova_compute[192593]: 2025-10-07 20:13:06.551 2 DEBUG nova.network.neutron [req-c36712f3-0c88-4498-8c9c-393c8f9ad024 req-a142f466-a025-4479-9aaf-b4a9ba1e2ea3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Refreshing network info cache for port 7ce9ef63-687e-420f-b85d-071abf475fd7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:13:07 np0005474864 podman[222698]: 2025-10-07 20:13:07.409583064 +0000 UTC m=+0.089533419 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Oct  7 16:13:07 np0005474864 nova_compute[192593]: 2025-10-07 20:13:07.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:08 np0005474864 nova_compute[192593]: 2025-10-07 20:13:08.634 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:10 np0005474864 nova_compute[192593]: 2025-10-07 20:13:10.941 2 DEBUG nova.network.neutron [req-c36712f3-0c88-4498-8c9c-393c8f9ad024 req-a142f466-a025-4479-9aaf-b4a9ba1e2ea3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Updated VIF entry in instance network info cache for port 7ce9ef63-687e-420f-b85d-071abf475fd7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:13:10 np0005474864 nova_compute[192593]: 2025-10-07 20:13:10.943 2 DEBUG nova.network.neutron [req-c36712f3-0c88-4498-8c9c-393c8f9ad024 req-a142f466-a025-4479-9aaf-b4a9ba1e2ea3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Updating instance_info_cache with network_info: [{"id": "7ce9ef63-687e-420f-b85d-071abf475fd7", "address": "fa:16:3e:8f:62:17", "network": {"id": "48bc8cb5-7112-4ac0-bcc2-12066714d0ea", "bridge": "br-int", "label": "tempest-network-smoke--2080782381", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ce9ef63-68", "ovs_interfaceid": "7ce9ef63-687e-420f-b85d-071abf475fd7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "be7697b8-3851-4db2-8ae0-bc42997f1332", "address": "fa:16:3e:fc:38:3e", "network": {"id": "50d1db2f-7e6a-4b01-96dc-cd47acf22206", "bridge": "br-int", "label": "tempest-network-smoke--306467718", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefc:383e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe7697b8-38", "ovs_interfaceid": "be7697b8-3851-4db2-8ae0-bc42997f1332", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:13:10 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:10Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f0:7b:b2 10.100.0.9
Oct  7 16:13:10 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:10Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f0:7b:b2 10.100.0.9
Oct  7 16:13:10 np0005474864 nova_compute[192593]: 2025-10-07 20:13:10.995 2 DEBUG oslo_concurrency.lockutils [req-c36712f3-0c88-4498-8c9c-393c8f9ad024 req-a142f466-a025-4479-9aaf-b4a9ba1e2ea3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-1669315a-9455-4ddc-bddf-b5a535be9294" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:13:11 np0005474864 podman[222729]: 2025-10-07 20:13:11.382371417 +0000 UTC m=+0.072379828 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  7 16:13:12 np0005474864 nova_compute[192593]: 2025-10-07 20:13:12.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:13 np0005474864 nova_compute[192593]: 2025-10-07 20:13:13.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:14 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:14Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8f:62:17 10.100.0.5
Oct  7 16:13:14 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:14Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8f:62:17 10.100.0.5
Oct  7 16:13:14 np0005474864 podman[222768]: 2025-10-07 20:13:14.418502275 +0000 UTC m=+0.105774726 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  7 16:13:16 np0005474864 nova_compute[192593]: 2025-10-07 20:13:16.016 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:13:16 np0005474864 nova_compute[192593]: 2025-10-07 20:13:16.052 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Triggering sync for uuid f2a4ae00-d828-4178-880f-cc034629d96e _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  7 16:13:16 np0005474864 nova_compute[192593]: 2025-10-07 20:13:16.053 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Triggering sync for uuid 1669315a-9455-4ddc-bddf-b5a535be9294 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Oct  7 16:13:16 np0005474864 nova_compute[192593]: 2025-10-07 20:13:16.053 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "f2a4ae00-d828-4178-880f-cc034629d96e" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:13:16 np0005474864 nova_compute[192593]: 2025-10-07 20:13:16.054 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "f2a4ae00-d828-4178-880f-cc034629d96e" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:13:16 np0005474864 nova_compute[192593]: 2025-10-07 20:13:16.054 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "1669315a-9455-4ddc-bddf-b5a535be9294" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:13:16 np0005474864 nova_compute[192593]: 2025-10-07 20:13:16.054 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "1669315a-9455-4ddc-bddf-b5a535be9294" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:13:16 np0005474864 nova_compute[192593]: 2025-10-07 20:13:16.107 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "1669315a-9455-4ddc-bddf-b5a535be9294" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.053s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:13:16 np0005474864 nova_compute[192593]: 2025-10-07 20:13:16.108 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "f2a4ae00-d828-4178-880f-cc034629d96e" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.054s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:13:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:16.186 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:13:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:16.187 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:13:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:16.188 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:13:17 np0005474864 nova_compute[192593]: 2025-10-07 20:13:17.161 2 INFO nova.compute.manager [None req-39717be0-a4cb-4d8a-a479-536f580b3b77 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Get console output#033[00m
Oct  7 16:13:17 np0005474864 nova_compute[192593]: 2025-10-07 20:13:17.170 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  7 16:13:17 np0005474864 nova_compute[192593]: 2025-10-07 20:13:17.754 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:18 np0005474864 nova_compute[192593]: 2025-10-07 20:13:18.131 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:13:18 np0005474864 nova_compute[192593]: 2025-10-07 20:13:18.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:19 np0005474864 nova_compute[192593]: 2025-10-07 20:13:19.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:13:19 np0005474864 nova_compute[192593]: 2025-10-07 20:13:19.128 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:13:19 np0005474864 nova_compute[192593]: 2025-10-07 20:13:19.129 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:13:19 np0005474864 nova_compute[192593]: 2025-10-07 20:13:19.130 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:13:19 np0005474864 nova_compute[192593]: 2025-10-07 20:13:19.130 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:13:19 np0005474864 nova_compute[192593]: 2025-10-07 20:13:19.218 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:13:19 np0005474864 nova_compute[192593]: 2025-10-07 20:13:19.320 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:13:19 np0005474864 nova_compute[192593]: 2025-10-07 20:13:19.321 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:13:19 np0005474864 nova_compute[192593]: 2025-10-07 20:13:19.411 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:13:19 np0005474864 nova_compute[192593]: 2025-10-07 20:13:19.419 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:13:19 np0005474864 nova_compute[192593]: 2025-10-07 20:13:19.509 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:13:19 np0005474864 nova_compute[192593]: 2025-10-07 20:13:19.511 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:13:19 np0005474864 nova_compute[192593]: 2025-10-07 20:13:19.598 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:13:19 np0005474864 nova_compute[192593]: 2025-10-07 20:13:19.827 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:13:19 np0005474864 nova_compute[192593]: 2025-10-07 20:13:19.828 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5401MB free_disk=73.40628433227539GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:13:19 np0005474864 nova_compute[192593]: 2025-10-07 20:13:19.828 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:13:19 np0005474864 nova_compute[192593]: 2025-10-07 20:13:19.829 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:13:20 np0005474864 nova_compute[192593]: 2025-10-07 20:13:20.027 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Instance f2a4ae00-d828-4178-880f-cc034629d96e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  7 16:13:20 np0005474864 nova_compute[192593]: 2025-10-07 20:13:20.027 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Instance 1669315a-9455-4ddc-bddf-b5a535be9294 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  7 16:13:20 np0005474864 nova_compute[192593]: 2025-10-07 20:13:20.028 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:13:20 np0005474864 nova_compute[192593]: 2025-10-07 20:13:20.028 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:13:20 np0005474864 nova_compute[192593]: 2025-10-07 20:13:20.254 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:13:20 np0005474864 nova_compute[192593]: 2025-10-07 20:13:20.269 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:13:20 np0005474864 nova_compute[192593]: 2025-10-07 20:13:20.285 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:13:20 np0005474864 nova_compute[192593]: 2025-10-07 20:13:20.286 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.457s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:13:20 np0005474864 nova_compute[192593]: 2025-10-07 20:13:20.286 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:13:20 np0005474864 nova_compute[192593]: 2025-10-07 20:13:20.286 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  7 16:13:21 np0005474864 nova_compute[192593]: 2025-10-07 20:13:21.297 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:13:22 np0005474864 nova_compute[192593]: 2025-10-07 20:13:22.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:13:22 np0005474864 nova_compute[192593]: 2025-10-07 20:13:22.700 2 DEBUG oslo_concurrency.lockutils [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "interface-f2a4ae00-d828-4178-880f-cc034629d96e-None" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:13:22 np0005474864 nova_compute[192593]: 2025-10-07 20:13:22.701 2 DEBUG oslo_concurrency.lockutils [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "interface-f2a4ae00-d828-4178-880f-cc034629d96e-None" acquired by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:13:22 np0005474864 nova_compute[192593]: 2025-10-07 20:13:22.702 2 DEBUG nova.objects.instance [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lazy-loading 'flavor' on Instance uuid f2a4ae00-d828-4178-880f-cc034629d96e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:13:22 np0005474864 nova_compute[192593]: 2025-10-07 20:13:22.756 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:23 np0005474864 nova_compute[192593]: 2025-10-07 20:13:23.087 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:13:23 np0005474864 nova_compute[192593]: 2025-10-07 20:13:23.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:13:23 np0005474864 nova_compute[192593]: 2025-10-07 20:13:23.094 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:13:23 np0005474864 nova_compute[192593]: 2025-10-07 20:13:23.094 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:13:23 np0005474864 nova_compute[192593]: 2025-10-07 20:13:23.644 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:24 np0005474864 nova_compute[192593]: 2025-10-07 20:13:24.142 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:13:24 np0005474864 nova_compute[192593]: 2025-10-07 20:13:24.143 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquired lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:13:24 np0005474864 nova_compute[192593]: 2025-10-07 20:13:24.143 2 DEBUG nova.network.neutron [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  7 16:13:24 np0005474864 nova_compute[192593]: 2025-10-07 20:13:24.143 2 DEBUG nova.objects.instance [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid f2a4ae00-d828-4178-880f-cc034629d96e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:13:24 np0005474864 nova_compute[192593]: 2025-10-07 20:13:24.245 2 DEBUG nova.objects.instance [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lazy-loading 'pci_requests' on Instance uuid f2a4ae00-d828-4178-880f-cc034629d96e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:13:24 np0005474864 nova_compute[192593]: 2025-10-07 20:13:24.264 2 DEBUG nova.network.neutron [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  7 16:13:24 np0005474864 nova_compute[192593]: 2025-10-07 20:13:24.652 2 DEBUG nova.policy [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  7 16:13:25 np0005474864 nova_compute[192593]: 2025-10-07 20:13:25.984 2 DEBUG nova.network.neutron [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Successfully created port: d400112d-f8c8-435e-a3c9-93cf89fe4cb2 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  7 16:13:27 np0005474864 nova_compute[192593]: 2025-10-07 20:13:27.288 2 DEBUG nova.network.neutron [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Updating instance_info_cache with network_info: [{"id": "63b103d2-ef83-41aa-9080-3137adafe387", "address": "fa:16:3e:f0:7b:b2", "network": {"id": "b153978d-a2d5-4c7d-8ff5-8249927e8e0f", "bridge": "br-int", "label": "tempest-network-smoke--1424478439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63b103d2-ef", "ovs_interfaceid": "63b103d2-ef83-41aa-9080-3137adafe387", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:13:27 np0005474864 nova_compute[192593]: 2025-10-07 20:13:27.309 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Releasing lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:13:27 np0005474864 nova_compute[192593]: 2025-10-07 20:13:27.309 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  7 16:13:27 np0005474864 nova_compute[192593]: 2025-10-07 20:13:27.310 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:13:27 np0005474864 nova_compute[192593]: 2025-10-07 20:13:27.310 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:13:27 np0005474864 nova_compute[192593]: 2025-10-07 20:13:27.311 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:13:27 np0005474864 nova_compute[192593]: 2025-10-07 20:13:27.311 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:13:27 np0005474864 nova_compute[192593]: 2025-10-07 20:13:27.311 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:13:27 np0005474864 nova_compute[192593]: 2025-10-07 20:13:27.311 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  7 16:13:27 np0005474864 nova_compute[192593]: 2025-10-07 20:13:27.333 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  7 16:13:27 np0005474864 nova_compute[192593]: 2025-10-07 20:13:27.333 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:13:27 np0005474864 podman[222802]: 2025-10-07 20:13:27.404620525 +0000 UTC m=+0.085077512 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, name=ubi9-minimal, distribution-scope=public, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7)
Oct  7 16:13:27 np0005474864 podman[222801]: 2025-10-07 20:13:27.432278919 +0000 UTC m=+0.107695561 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:13:27 np0005474864 nova_compute[192593]: 2025-10-07 20:13:27.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:28 np0005474864 nova_compute[192593]: 2025-10-07 20:13:28.338 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:13:28 np0005474864 nova_compute[192593]: 2025-10-07 20:13:28.647 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:29 np0005474864 nova_compute[192593]: 2025-10-07 20:13:29.436 2 DEBUG nova.network.neutron [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Successfully updated port: d400112d-f8c8-435e-a3c9-93cf89fe4cb2 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:13:29 np0005474864 nova_compute[192593]: 2025-10-07 20:13:29.452 2 DEBUG oslo_concurrency.lockutils [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:13:29 np0005474864 nova_compute[192593]: 2025-10-07 20:13:29.452 2 DEBUG oslo_concurrency.lockutils [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquired lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:13:29 np0005474864 nova_compute[192593]: 2025-10-07 20:13:29.452 2 DEBUG nova.network.neutron [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:13:32 np0005474864 nova_compute[192593]: 2025-10-07 20:13:32.477 2 DEBUG nova.compute.manager [req-cf570cc1-2542-4df4-9fde-40cbd41a61e7 req-bd595df9-5ae7-4bf4-acdb-7628f0175cae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received event network-changed-d400112d-f8c8-435e-a3c9-93cf89fe4cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:13:32 np0005474864 nova_compute[192593]: 2025-10-07 20:13:32.477 2 DEBUG nova.compute.manager [req-cf570cc1-2542-4df4-9fde-40cbd41a61e7 req-bd595df9-5ae7-4bf4-acdb-7628f0175cae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Refreshing instance network info cache due to event network-changed-d400112d-f8c8-435e-a3c9-93cf89fe4cb2. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:13:32 np0005474864 nova_compute[192593]: 2025-10-07 20:13:32.478 2 DEBUG oslo_concurrency.lockutils [req-cf570cc1-2542-4df4-9fde-40cbd41a61e7 req-bd595df9-5ae7-4bf4-acdb-7628f0175cae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:13:32 np0005474864 nova_compute[192593]: 2025-10-07 20:13:32.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:32 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:32Z|00088|binding|INFO|Releasing lport 8debe99b-81af-459e-8217-4b06af1ff98e from this chassis (sb_readonly=0)
Oct  7 16:13:32 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:32Z|00089|binding|INFO|Releasing lport 35331a4a-db8c-4977-9b95-771260e3e40b from this chassis (sb_readonly=0)
Oct  7 16:13:32 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:32Z|00090|binding|INFO|Releasing lport f34158c9-a766-4691-8248-3424f7b7ca88 from this chassis (sb_readonly=0)
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.063 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:33 np0005474864 podman[222851]: 2025-10-07 20:13:33.401684687 +0000 UTC m=+0.073870700 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Oct  7 16:13:33 np0005474864 podman[222849]: 2025-10-07 20:13:33.41502195 +0000 UTC m=+0.106676252 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid)
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.480 2 DEBUG nova.network.neutron [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Updating instance_info_cache with network_info: [{"id": "63b103d2-ef83-41aa-9080-3137adafe387", "address": "fa:16:3e:f0:7b:b2", "network": {"id": "b153978d-a2d5-4c7d-8ff5-8249927e8e0f", "bridge": "br-int", "label": "tempest-network-smoke--1424478439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63b103d2-ef", "ovs_interfaceid": "63b103d2-ef83-41aa-9080-3137adafe387", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "address": "fa:16:3e:8b:79:39", "network": {"id": "c5280796-8b7c-4945-a626-715e8ea6e041", "bridge": "br-int", "label": "tempest-network-smoke--775622979", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd400112d-f8", "ovs_interfaceid": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:13:33 np0005474864 podman[222850]: 2025-10-07 20:13:33.511371934 +0000 UTC m=+0.193052520 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.512 2 DEBUG oslo_concurrency.lockutils [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Releasing lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.515 2 DEBUG oslo_concurrency.lockutils [req-cf570cc1-2542-4df4-9fde-40cbd41a61e7 req-bd595df9-5ae7-4bf4-acdb-7628f0175cae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.515 2 DEBUG nova.network.neutron [req-cf570cc1-2542-4df4-9fde-40cbd41a61e7 req-bd595df9-5ae7-4bf4-acdb-7628f0175cae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Refreshing network info cache for port d400112d-f8c8-435e-a3c9-93cf89fe4cb2 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.519 2 DEBUG nova.virt.libvirt.vif [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:12:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-93472678',display_name='tempest-TestNetworkBasicOps-server-93472678',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-93472678',id=15,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNqIkKxPhTDFrF1mYsLGT4VoIX/sFe8HBnealVia+nFLppfgO0Xe3SvxixBmFjjO1nG4Niu4XVzOpfWewXCUpRStprU6Q2hEwG42+Uag+EI9HED37Cp6MNeCsqGkhMnNLA==',key_name='tempest-TestNetworkBasicOps-2102187017',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:12:54Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-93oohg0u',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:12:54Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=f2a4ae00-d828-4178-880f-cc034629d96e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "address": "fa:16:3e:8b:79:39", "network": {"id": "c5280796-8b7c-4945-a626-715e8ea6e041", "bridge": "br-int", "label": "tempest-network-smoke--775622979", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd400112d-f8", "ovs_interfaceid": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.519 2 DEBUG nova.network.os_vif_util [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converting VIF {"id": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "address": "fa:16:3e:8b:79:39", "network": {"id": "c5280796-8b7c-4945-a626-715e8ea6e041", "bridge": "br-int", "label": "tempest-network-smoke--775622979", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd400112d-f8", "ovs_interfaceid": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.519 2 DEBUG nova.network.os_vif_util [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:79:39,bridge_name='br-int',has_traffic_filtering=True,id=d400112d-f8c8-435e-a3c9-93cf89fe4cb2,network=Network(c5280796-8b7c-4945-a626-715e8ea6e041),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd400112d-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.520 2 DEBUG os_vif [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:79:39,bridge_name='br-int',has_traffic_filtering=True,id=d400112d-f8c8-435e-a3c9-93cf89fe4cb2,network=Network(c5280796-8b7c-4945-a626-715e8ea6e041),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd400112d-f8') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.522 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.523 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.527 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.527 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd400112d-f8, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.528 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapd400112d-f8, col_values=(('external_ids', {'iface-id': 'd400112d-f8c8-435e-a3c9-93cf89fe4cb2', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8b:79:39', 'vm-uuid': 'f2a4ae00-d828-4178-880f-cc034629d96e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:13:33 np0005474864 NetworkManager[51631]: <info>  [1759868013.5346] manager: (tapd400112d-f8): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/58)
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.543 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.546 2 INFO os_vif [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:79:39,bridge_name='br-int',has_traffic_filtering=True,id=d400112d-f8c8-435e-a3c9-93cf89fe4cb2,network=Network(c5280796-8b7c-4945-a626-715e8ea6e041),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd400112d-f8')#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.548 2 DEBUG nova.virt.libvirt.vif [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:12:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-93472678',display_name='tempest-TestNetworkBasicOps-server-93472678',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-93472678',id=15,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNqIkKxPhTDFrF1mYsLGT4VoIX/sFe8HBnealVia+nFLppfgO0Xe3SvxixBmFjjO1nG4Niu4XVzOpfWewXCUpRStprU6Q2hEwG42+Uag+EI9HED37Cp6MNeCsqGkhMnNLA==',key_name='tempest-TestNetworkBasicOps-2102187017',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:12:54Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-93oohg0u',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:12:54Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=f2a4ae00-d828-4178-880f-cc034629d96e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "address": "fa:16:3e:8b:79:39", "network": {"id": "c5280796-8b7c-4945-a626-715e8ea6e041", "bridge": "br-int", "label": "tempest-network-smoke--775622979", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd400112d-f8", "ovs_interfaceid": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.548 2 DEBUG nova.network.os_vif_util [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converting VIF {"id": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "address": "fa:16:3e:8b:79:39", "network": {"id": "c5280796-8b7c-4945-a626-715e8ea6e041", "bridge": "br-int", "label": "tempest-network-smoke--775622979", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd400112d-f8", "ovs_interfaceid": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.549 2 DEBUG nova.network.os_vif_util [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:79:39,bridge_name='br-int',has_traffic_filtering=True,id=d400112d-f8c8-435e-a3c9-93cf89fe4cb2,network=Network(c5280796-8b7c-4945-a626-715e8ea6e041),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd400112d-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.553 2 DEBUG nova.virt.libvirt.guest [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] attach device xml: <interface type="ethernet">
Oct  7 16:13:33 np0005474864 nova_compute[192593]:  <mac address="fa:16:3e:8b:79:39"/>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:  <model type="virtio"/>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:  <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:  <mtu size="1442"/>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:  <target dev="tapd400112d-f8"/>
Oct  7 16:13:33 np0005474864 nova_compute[192593]: </interface>
Oct  7 16:13:33 np0005474864 nova_compute[192593]: attach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:339#033[00m
Oct  7 16:13:33 np0005474864 NetworkManager[51631]: <info>  [1759868013.5688] manager: (tapd400112d-f8): new Tun device (/org/freedesktop/NetworkManager/Devices/59)
Oct  7 16:13:33 np0005474864 kernel: tapd400112d-f8: entered promiscuous mode
Oct  7 16:13:33 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:33Z|00091|binding|INFO|Claiming lport d400112d-f8c8-435e-a3c9-93cf89fe4cb2 for this chassis.
Oct  7 16:13:33 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:33Z|00092|binding|INFO|d400112d-f8c8-435e-a3c9-93cf89fe4cb2: Claiming fa:16:3e:8b:79:39 10.100.0.22
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.577 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.600 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:79:39 10.100.0.22'], port_security=['fa:16:3e:8b:79:39 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': 'f2a4ae00-d828-4178-880f-cc034629d96e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c5280796-8b7c-4945-a626-715e8ea6e041', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '57491b24c6b2419c842483a87c8b4d42', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b761b10b-19d2-4034-8cff-5d7b8a0481c0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=749d8a78-c1fd-454b-b141-0fdb13fd4916, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=d400112d-f8c8-435e-a3c9-93cf89fe4cb2) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.603 103685 INFO neutron.agent.ovn.metadata.agent [-] Port d400112d-f8c8-435e-a3c9-93cf89fe4cb2 in datapath c5280796-8b7c-4945-a626-715e8ea6e041 bound to our chassis#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.607 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c5280796-8b7c-4945-a626-715e8ea6e041#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.619 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[1e4c6661-7fcc-411c-8a88-e352b283554f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.620 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc5280796-81 in ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.622 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc5280796-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.623 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[9362deb8-de33-4e61-bf56-723ce0eebcec]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.624 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d25fd736-76e4-4444-a43d-88136bb44020]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:33 np0005474864 systemd-udevd[222920]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.642 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[16cd59ff-ad16-4f88-ac3f-a79122898d84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:33 np0005474864 NetworkManager[51631]: <info>  [1759868013.6497] device (tapd400112d-f8): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:13:33 np0005474864 NetworkManager[51631]: <info>  [1759868013.6507] device (tapd400112d-f8): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:13:33 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:33Z|00093|binding|INFO|Setting lport d400112d-f8c8-435e-a3c9-93cf89fe4cb2 ovn-installed in OVS
Oct  7 16:13:33 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:33Z|00094|binding|INFO|Setting lport d400112d-f8c8-435e-a3c9-93cf89fe4cb2 up in Southbound
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.654 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.658 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[355b6b7b-b53f-48aa-9aac-3a1237e5e0d1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.679 2 DEBUG nova.virt.libvirt.driver [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.679 2 DEBUG nova.virt.libvirt.driver [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.679 2 DEBUG nova.virt.libvirt.driver [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] No VIF found with MAC fa:16:3e:f0:7b:b2, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.680 2 DEBUG nova.virt.libvirt.driver [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] No VIF found with MAC fa:16:3e:8b:79:39, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.690 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[32f94ee1-9b27-404a-994c-f289790e7e75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:33 np0005474864 NetworkManager[51631]: <info>  [1759868013.6974] manager: (tapc5280796-80): new Veth device (/org/freedesktop/NetworkManager/Devices/60)
Oct  7 16:13:33 np0005474864 systemd-udevd[222924]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.698 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[bb604624-35e4-4f93-8e30-b8abdab1ab1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.702 2 DEBUG nova.virt.libvirt.guest [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:13:33 np0005474864 nova_compute[192593]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:  <nova:name>tempest-TestNetworkBasicOps-server-93472678</nova:name>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:  <nova:creationTime>2025-10-07 20:13:33</nova:creationTime>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:  <nova:flavor name="m1.nano">
Oct  7 16:13:33 np0005474864 nova_compute[192593]:    <nova:memory>128</nova:memory>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:    <nova:disk>1</nova:disk>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:    <nova:swap>0</nova:swap>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:    <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:    <nova:vcpus>1</nova:vcpus>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:  </nova:flavor>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:  <nova:owner>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:    <nova:user uuid="fde8db13cdde4728903e9d2749f853e1">tempest-TestNetworkBasicOps-666319938-project-member</nova:user>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:    <nova:project uuid="57491b24c6b2419c842483a87c8b4d42">tempest-TestNetworkBasicOps-666319938</nova:project>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:  </nova:owner>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:  <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:  <nova:ports>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:    <nova:port uuid="63b103d2-ef83-41aa-9080-3137adafe387">
Oct  7 16:13:33 np0005474864 nova_compute[192593]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:    </nova:port>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:    <nova:port uuid="d400112d-f8c8-435e-a3c9-93cf89fe4cb2">
Oct  7 16:13:33 np0005474864 nova_compute[192593]:      <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:    </nova:port>
Oct  7 16:13:33 np0005474864 nova_compute[192593]:  </nova:ports>
Oct  7 16:13:33 np0005474864 nova_compute[192593]: </nova:instance>
Oct  7 16:13:33 np0005474864 nova_compute[192593]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.736 2 DEBUG oslo_concurrency.lockutils [None req-a340ec53-6960-41ad-8e91-3cdc2d6e10d0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "interface-f2a4ae00-d828-4178-880f-cc034629d96e-None" "released" by "nova.compute.manager.ComputeManager.attach_interface.<locals>.do_attach_interface" :: held 11.035s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.739 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[d865cb28-cc7f-47ad-aa9f-7a4a518e0a09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.743 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[c0d2c3af-ad53-4e0e-a327-58654d669233]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:33 np0005474864 NetworkManager[51631]: <info>  [1759868013.7676] device (tapc5280796-80): carrier: link connected
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.774 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[10c49a6e-7e41-4e07-9508-607fdb8ee01b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.793 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[8910d242-4565-4c78-bc32-723b4556838a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc5280796-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ed:3a:4e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364966, 'reachable_time': 35539, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222946, 'error': None, 'target': 'ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.807 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[bd898a6c-5fe9-4537-b1b6-7c6bda7e9d7d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feed:3a4e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 364966, 'tstamp': 364966}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222947, 'error': None, 'target': 'ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.825 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[e1e86f08-3920-4d2e-9d34-eb08830b0d1f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc5280796-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ed:3a:4e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 32], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364966, 'reachable_time': 35539, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222948, 'error': None, 'target': 'ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.852 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[cba8c92d-eb81-49d6-b2d8-1809ce9a4a86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.916 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[e9560963-8382-46ac-a439-0aca3d42fd07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.918 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc5280796-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.918 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.919 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc5280796-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.921 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:33 np0005474864 NetworkManager[51631]: <info>  [1759868013.9228] manager: (tapc5280796-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Oct  7 16:13:33 np0005474864 kernel: tapc5280796-80: entered promiscuous mode
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.929 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc5280796-80, col_values=(('external_ids', {'iface-id': '39dd9731-e00b-4068-94b2-55cd8dfde113'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:33 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:33Z|00095|binding|INFO|Releasing lport 39dd9731-e00b-4068-94b2-55cd8dfde113 from this chassis (sb_readonly=0)
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:33 np0005474864 nova_compute[192593]: 2025-10-07 20:13:33.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.959 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c5280796-8b7c-4945-a626-715e8ea6e041.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c5280796-8b7c-4945-a626-715e8ea6e041.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.960 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d910febe-1e4f-4538-97b9-24fe26d926d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.962 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-c5280796-8b7c-4945-a626-715e8ea6e041
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/c5280796-8b7c-4945-a626-715e8ea6e041.pid.haproxy
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID c5280796-8b7c-4945-a626-715e8ea6e041
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:13:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:33.963 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041', 'env', 'PROCESS_TAG=haproxy-c5280796-8b7c-4945-a626-715e8ea6e041', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c5280796-8b7c-4945-a626-715e8ea6e041.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:13:34 np0005474864 podman[222980]: 2025-10-07 20:13:34.400025231 +0000 UTC m=+0.062717211 container create 4bd39f2002309b32b4c72fe9c4f64425a1a772b4910b5b448bcaccc665662638 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:13:34 np0005474864 systemd[1]: Started libpod-conmon-4bd39f2002309b32b4c72fe9c4f64425a1a772b4910b5b448bcaccc665662638.scope.
Oct  7 16:13:34 np0005474864 podman[222980]: 2025-10-07 20:13:34.360850497 +0000 UTC m=+0.023542557 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:13:34 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:13:34 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c6ed1782d16cb1274bad95d4b2b4d6e2a5eafbf934d2a876e65b06738b2e2402/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:13:34 np0005474864 podman[222980]: 2025-10-07 20:13:34.511009095 +0000 UTC m=+0.173701155 container init 4bd39f2002309b32b4c72fe9c4f64425a1a772b4910b5b448bcaccc665662638 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  7 16:13:34 np0005474864 podman[222980]: 2025-10-07 20:13:34.52092549 +0000 UTC m=+0.183617500 container start 4bd39f2002309b32b4c72fe9c4f64425a1a772b4910b5b448bcaccc665662638 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  7 16:13:34 np0005474864 neutron-haproxy-ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041[222996]: [NOTICE]   (223000) : New worker (223002) forked
Oct  7 16:13:34 np0005474864 neutron-haproxy-ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041[222996]: [NOTICE]   (223000) : Loading success.
Oct  7 16:13:36 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:36Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8b:79:39 10.100.0.22
Oct  7 16:13:36 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:36Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8b:79:39 10.100.0.22
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.566 2 DEBUG nova.compute.manager [req-29f00ec5-d8d1-4b93-a4e3-bc35718c61f2 req-56543c4e-9309-4ac3-8f9d-93e5f6baf539 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received event network-vif-plugged-d400112d-f8c8-435e-a3c9-93cf89fe4cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.568 2 DEBUG oslo_concurrency.lockutils [req-29f00ec5-d8d1-4b93-a4e3-bc35718c61f2 req-56543c4e-9309-4ac3-8f9d-93e5f6baf539 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.568 2 DEBUG oslo_concurrency.lockutils [req-29f00ec5-d8d1-4b93-a4e3-bc35718c61f2 req-56543c4e-9309-4ac3-8f9d-93e5f6baf539 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.569 2 DEBUG oslo_concurrency.lockutils [req-29f00ec5-d8d1-4b93-a4e3-bc35718c61f2 req-56543c4e-9309-4ac3-8f9d-93e5f6baf539 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.570 2 DEBUG nova.compute.manager [req-29f00ec5-d8d1-4b93-a4e3-bc35718c61f2 req-56543c4e-9309-4ac3-8f9d-93e5f6baf539 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] No waiting events found dispatching network-vif-plugged-d400112d-f8c8-435e-a3c9-93cf89fe4cb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.570 2 WARNING nova.compute.manager [req-29f00ec5-d8d1-4b93-a4e3-bc35718c61f2 req-56543c4e-9309-4ac3-8f9d-93e5f6baf539 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received unexpected event network-vif-plugged-d400112d-f8c8-435e-a3c9-93cf89fe4cb2 for instance with vm_state active and task_state None.#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.571 2 DEBUG nova.compute.manager [req-29f00ec5-d8d1-4b93-a4e3-bc35718c61f2 req-56543c4e-9309-4ac3-8f9d-93e5f6baf539 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received event network-vif-plugged-d400112d-f8c8-435e-a3c9-93cf89fe4cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.571 2 DEBUG oslo_concurrency.lockutils [req-29f00ec5-d8d1-4b93-a4e3-bc35718c61f2 req-56543c4e-9309-4ac3-8f9d-93e5f6baf539 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.572 2 DEBUG oslo_concurrency.lockutils [req-29f00ec5-d8d1-4b93-a4e3-bc35718c61f2 req-56543c4e-9309-4ac3-8f9d-93e5f6baf539 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.572 2 DEBUG oslo_concurrency.lockutils [req-29f00ec5-d8d1-4b93-a4e3-bc35718c61f2 req-56543c4e-9309-4ac3-8f9d-93e5f6baf539 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.573 2 DEBUG nova.compute.manager [req-29f00ec5-d8d1-4b93-a4e3-bc35718c61f2 req-56543c4e-9309-4ac3-8f9d-93e5f6baf539 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] No waiting events found dispatching network-vif-plugged-d400112d-f8c8-435e-a3c9-93cf89fe4cb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.573 2 WARNING nova.compute.manager [req-29f00ec5-d8d1-4b93-a4e3-bc35718c61f2 req-56543c4e-9309-4ac3-8f9d-93e5f6baf539 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received unexpected event network-vif-plugged-d400112d-f8c8-435e-a3c9-93cf89fe4cb2 for instance with vm_state active and task_state None.#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.783 2 DEBUG oslo_concurrency.lockutils [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "interface-f2a4ae00-d828-4178-880f-cc034629d96e-d400112d-f8c8-435e-a3c9-93cf89fe4cb2" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.784 2 DEBUG oslo_concurrency.lockutils [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "interface-f2a4ae00-d828-4178-880f-cc034629d96e-d400112d-f8c8-435e-a3c9-93cf89fe4cb2" acquired by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.801 2 DEBUG nova.objects.instance [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lazy-loading 'flavor' on Instance uuid f2a4ae00-d828-4178-880f-cc034629d96e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.826 2 DEBUG nova.virt.libvirt.vif [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:12:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-93472678',display_name='tempest-TestNetworkBasicOps-server-93472678',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-93472678',id=15,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNqIkKxPhTDFrF1mYsLGT4VoIX/sFe8HBnealVia+nFLppfgO0Xe3SvxixBmFjjO1nG4Niu4XVzOpfWewXCUpRStprU6Q2hEwG42+Uag+EI9HED37Cp6MNeCsqGkhMnNLA==',key_name='tempest-TestNetworkBasicOps-2102187017',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:12:54Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-93oohg0u',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:12:54Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=f2a4ae00-d828-4178-880f-cc034629d96e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "address": "fa:16:3e:8b:79:39", "network": {"id": "c5280796-8b7c-4945-a626-715e8ea6e041", "bridge": "br-int", "label": "tempest-network-smoke--775622979", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd400112d-f8", "ovs_interfaceid": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.826 2 DEBUG nova.network.os_vif_util [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converting VIF {"id": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "address": "fa:16:3e:8b:79:39", "network": {"id": "c5280796-8b7c-4945-a626-715e8ea6e041", "bridge": "br-int", "label": "tempest-network-smoke--775622979", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd400112d-f8", "ovs_interfaceid": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.827 2 DEBUG nova.network.os_vif_util [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:79:39,bridge_name='br-int',has_traffic_filtering=True,id=d400112d-f8c8-435e-a3c9-93cf89fe4cb2,network=Network(c5280796-8b7c-4945-a626-715e8ea6e041),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd400112d-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.832 2 DEBUG nova.virt.libvirt.guest [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:8b:79:39"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd400112d-f8"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.835 2 DEBUG nova.virt.libvirt.guest [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:8b:79:39"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd400112d-f8"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.838 2 DEBUG nova.virt.libvirt.driver [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Attempting to detach device tapd400112d-f8 from instance f2a4ae00-d828-4178-880f-cc034629d96e from the persistent domain config. _detach_from_persistent /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2487#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.838 2 DEBUG nova.virt.libvirt.guest [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] detach device xml: <interface type="ethernet">
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <mac address="fa:16:3e:8b:79:39"/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <model type="virtio"/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <mtu size="1442"/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <target dev="tapd400112d-f8"/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]: </interface>
Oct  7 16:13:36 np0005474864 nova_compute[192593]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.846 2 DEBUG nova.virt.libvirt.guest [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:8b:79:39"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd400112d-f8"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.849 2 DEBUG nova.virt.libvirt.guest [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:8b:79:39"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd400112d-f8"/></interface>not found in domain: <domain type='kvm' id='5'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <name>instance-0000000f</name>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <uuid>f2a4ae00-d828-4178-880f-cc034629d96e</uuid>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <nova:name>tempest-TestNetworkBasicOps-server-93472678</nova:name>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <nova:creationTime>2025-10-07 20:13:33</nova:creationTime>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <nova:flavor name="m1.nano">
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <nova:memory>128</nova:memory>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <nova:disk>1</nova:disk>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <nova:swap>0</nova:swap>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <nova:vcpus>1</nova:vcpus>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  </nova:flavor>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <nova:owner>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <nova:user uuid="fde8db13cdde4728903e9d2749f853e1">tempest-TestNetworkBasicOps-666319938-project-member</nova:user>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <nova:project uuid="57491b24c6b2419c842483a87c8b4d42">tempest-TestNetworkBasicOps-666319938</nova:project>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  </nova:owner>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <nova:ports>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <nova:port uuid="63b103d2-ef83-41aa-9080-3137adafe387">
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </nova:port>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <nova:port uuid="d400112d-f8c8-435e-a3c9-93cf89fe4cb2">
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </nova:port>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  </nova:ports>
Oct  7 16:13:36 np0005474864 nova_compute[192593]: </nova:instance>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <memory unit='KiB'>131072</memory>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <vcpu placement='static'>1</vcpu>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <resource>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <partition>/machine</partition>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  </resource>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <sysinfo type='smbios'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <entry name='manufacturer'>RDO</entry>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <entry name='product'>OpenStack Compute</entry>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <entry name='serial'>f2a4ae00-d828-4178-880f-cc034629d96e</entry>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <entry name='uuid'>f2a4ae00-d828-4178-880f-cc034629d96e</entry>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <entry name='family'>Virtual Machine</entry>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <boot dev='hd'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <smbios mode='sysinfo'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <vmcoreinfo state='on'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <cpu mode='custom' match='exact' check='full'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <model fallback='forbid'>Nehalem</model>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <feature policy='require' name='x2apic'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <feature policy='require' name='hypervisor'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <feature policy='require' name='vme'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <clock offset='utc'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <timer name='pit' tickpolicy='delay'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <timer name='hpet' present='no'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <on_poweroff>destroy</on_poweroff>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <on_reboot>restart</on_reboot>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <on_crash>destroy</on_crash>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <disk type='file' device='disk'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <source file='/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk' index='2'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <backingStore type='file' index='3'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:        <format type='raw'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:        <source file='/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:        <backingStore/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      </backingStore>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target dev='vda' bus='virtio'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='virtio-disk0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <disk type='file' device='cdrom'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <driver name='qemu' type='raw' cache='none'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <source file='/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk.config' index='1'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <backingStore/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target dev='sda' bus='sata'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <readonly/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='sata0-0-0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='0' model='pcie-root'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pcie.0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='1' port='0x10'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.1'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='2' port='0x11'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.2'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='3' port='0x12'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.3'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='4' port='0x13'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.4'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='5' port='0x14'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.5'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='6' port='0x15'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.6'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='7' port='0x16'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.7'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='8' port='0x17'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.8'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='9' port='0x18'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.9'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='10' port='0x19'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.10'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='11' port='0x1a'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.11'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='12' port='0x1b'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.12'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='13' port='0x1c'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.13'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='14' port='0x1d'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.14'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='15' port='0x1e'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.15'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='16' port='0x1f'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.16'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='17' port='0x20'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.17'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='18' port='0x21'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.18'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='19' port='0x22'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.19'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='20' port='0x23'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.20'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='21' port='0x24'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.21'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='22' port='0x25'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.22'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='23' port='0x26'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.23'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='24' port='0x27'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.24'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target chassis='25' port='0x28'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.25'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model name='pcie-pci-bridge'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='pci.26'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='usb'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <controller type='sata' index='0'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='ide'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <interface type='ethernet'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <mac address='fa:16:3e:f0:7b:b2'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target dev='tap63b103d2-ef'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model type='virtio'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <driver name='vhost' rx_queue_size='512'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <mtu size='1442'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='net0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <interface type='ethernet'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <mac address='fa:16:3e:8b:79:39'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target dev='tapd400112d-f8'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model type='virtio'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <driver name='vhost' rx_queue_size='512'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <mtu size='1442'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='net1'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x06' slot='0x00' function='0x0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <serial type='pty'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <source path='/dev/pts/0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <log file='/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/console.log' append='off'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target type='isa-serial' port='0'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:        <model name='isa-serial'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      </target>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='serial0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <console type='pty' tty='/dev/pts/0'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <source path='/dev/pts/0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <log file='/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/console.log' append='off'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <target type='serial' port='0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='serial0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </console>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <input type='tablet' bus='usb'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='input0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='usb' bus='0' port='1'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </input>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <input type='mouse' bus='ps2'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='input1'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </input>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <input type='keyboard' bus='ps2'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='input2'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </input>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <listen type='address' address='::0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </graphics>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <audio id='1' type='none'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <model type='virtio' heads='1' primary='yes'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='video0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <watchdog model='itco' action='reset'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='watchdog0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </watchdog>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <memballoon model='virtio'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <stats period='10'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='balloon0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <rng model='virtio'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <backend model='random'>/dev/urandom</backend>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <alias name='rng0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <label>system_u:system_r:svirt_t:s0:c236,c324</label>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c236,c324</imagelabel>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  </seclabel>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <label>+107:+107</label>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:    <imagelabel>+107:+107</imagelabel>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  </seclabel>
Oct  7 16:13:36 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:13:36 np0005474864 nova_compute[192593]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.850 2 INFO nova.virt.libvirt.driver [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Successfully detached device tapd400112d-f8 from instance f2a4ae00-d828-4178-880f-cc034629d96e from the persistent domain config.#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.850 2 DEBUG nova.virt.libvirt.driver [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] (1/8): Attempting to detach device tapd400112d-f8 with device alias net1 from instance f2a4ae00-d828-4178-880f-cc034629d96e from the live domain config. _detach_from_live_with_retry /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2523#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.850 2 DEBUG nova.virt.libvirt.guest [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] detach device xml: <interface type="ethernet">
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <mac address="fa:16:3e:8b:79:39"/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <model type="virtio"/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <mtu size="1442"/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]:  <target dev="tapd400112d-f8"/>
Oct  7 16:13:36 np0005474864 nova_compute[192593]: </interface>
Oct  7 16:13:36 np0005474864 nova_compute[192593]: detach_device /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:465#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.869 2 DEBUG nova.network.neutron [req-cf570cc1-2542-4df4-9fde-40cbd41a61e7 req-bd595df9-5ae7-4bf4-acdb-7628f0175cae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Updated VIF entry in instance network info cache for port d400112d-f8c8-435e-a3c9-93cf89fe4cb2. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.869 2 DEBUG nova.network.neutron [req-cf570cc1-2542-4df4-9fde-40cbd41a61e7 req-bd595df9-5ae7-4bf4-acdb-7628f0175cae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Updating instance_info_cache with network_info: [{"id": "63b103d2-ef83-41aa-9080-3137adafe387", "address": "fa:16:3e:f0:7b:b2", "network": {"id": "b153978d-a2d5-4c7d-8ff5-8249927e8e0f", "bridge": "br-int", "label": "tempest-network-smoke--1424478439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63b103d2-ef", "ovs_interfaceid": "63b103d2-ef83-41aa-9080-3137adafe387", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "address": "fa:16:3e:8b:79:39", "network": {"id": "c5280796-8b7c-4945-a626-715e8ea6e041", "bridge": "br-int", "label": "tempest-network-smoke--775622979", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd400112d-f8", "ovs_interfaceid": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.887 2 DEBUG oslo_concurrency.lockutils [req-cf570cc1-2542-4df4-9fde-40cbd41a61e7 req-bd595df9-5ae7-4bf4-acdb-7628f0175cae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:13:36 np0005474864 kernel: tapd400112d-f8 (unregistering): left promiscuous mode
Oct  7 16:13:36 np0005474864 NetworkManager[51631]: <info>  [1759868016.9539] device (tapd400112d-f8): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:13:36 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:36Z|00096|binding|INFO|Releasing lport d400112d-f8c8-435e-a3c9-93cf89fe4cb2 from this chassis (sb_readonly=0)
Oct  7 16:13:36 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:36Z|00097|binding|INFO|Setting lport d400112d-f8c8-435e-a3c9-93cf89fe4cb2 down in Southbound
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:36 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:36Z|00098|binding|INFO|Removing iface tapd400112d-f8 ovn-installed in OVS
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:36 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:36.980 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8b:79:39 10.100.0.22'], port_security=['fa:16:3e:8b:79:39 10.100.0.22'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.22/28', 'neutron:device_id': 'f2a4ae00-d828-4178-880f-cc034629d96e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c5280796-8b7c-4945-a626-715e8ea6e041', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '57491b24c6b2419c842483a87c8b4d42', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b761b10b-19d2-4034-8cff-5d7b8a0481c0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=749d8a78-c1fd-454b-b141-0fdb13fd4916, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=d400112d-f8c8-435e-a3c9-93cf89fe4cb2) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:13:36 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:36.983 103685 INFO neutron.agent.ovn.metadata.agent [-] Port d400112d-f8c8-435e-a3c9-93cf89fe4cb2 in datapath c5280796-8b7c-4945-a626-715e8ea6e041 unbound from our chassis#033[00m
Oct  7 16:13:36 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:36.986 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c5280796-8b7c-4945-a626-715e8ea6e041, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.989 2 DEBUG nova.virt.libvirt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Received event <DeviceRemovedEvent: 1759868016.9890993, f2a4ae00-d828-4178-880f-cc034629d96e => net1> from libvirt while the driver is waiting for it; dispatched. emit_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2370#033[00m
Oct  7 16:13:36 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:36.988 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[3eef3b4a-3934-4097-8789-371206e3052e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:36 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:36.990 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041 namespace which is not needed anymore#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.992 2 DEBUG nova.virt.libvirt.driver [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Start waiting for the detach event from libvirt for device tapd400112d-f8 with device alias net1 for instance f2a4ae00-d828-4178-880f-cc034629d96e _detach_from_live_and_wait_for_event /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:2599#033[00m
Oct  7 16:13:36 np0005474864 nova_compute[192593]: 2025-10-07 20:13:36.993 2 DEBUG nova.virt.libvirt.guest [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:8b:79:39"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd400112d-f8"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257#033[00m
Oct  7 16:13:37 np0005474864 nova_compute[192593]: 2025-10-07 20:13:37.004 2 DEBUG nova.virt.libvirt.guest [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:8b:79:39"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd400112d-f8"/></interface>not found in domain: <domain type='kvm' id='5'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <name>instance-0000000f</name>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <uuid>f2a4ae00-d828-4178-880f-cc034629d96e</uuid>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <nova:name>tempest-TestNetworkBasicOps-server-93472678</nova:name>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <nova:creationTime>2025-10-07 20:13:33</nova:creationTime>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <nova:flavor name="m1.nano">
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <nova:memory>128</nova:memory>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <nova:disk>1</nova:disk>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <nova:swap>0</nova:swap>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <nova:vcpus>1</nova:vcpus>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  </nova:flavor>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <nova:owner>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <nova:user uuid="fde8db13cdde4728903e9d2749f853e1">tempest-TestNetworkBasicOps-666319938-project-member</nova:user>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <nova:project uuid="57491b24c6b2419c842483a87c8b4d42">tempest-TestNetworkBasicOps-666319938</nova:project>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  </nova:owner>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <nova:ports>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <nova:port uuid="63b103d2-ef83-41aa-9080-3137adafe387">
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </nova:port>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <nova:port uuid="d400112d-f8c8-435e-a3c9-93cf89fe4cb2">
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <nova:ip type="fixed" address="10.100.0.22" ipVersion="4"/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </nova:port>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  </nova:ports>
Oct  7 16:13:37 np0005474864 nova_compute[192593]: </nova:instance>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <memory unit='KiB'>131072</memory>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <vcpu placement='static'>1</vcpu>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <resource>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <partition>/machine</partition>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  </resource>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <sysinfo type='smbios'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <entry name='manufacturer'>RDO</entry>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <entry name='product'>OpenStack Compute</entry>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <entry name='serial'>f2a4ae00-d828-4178-880f-cc034629d96e</entry>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <entry name='uuid'>f2a4ae00-d828-4178-880f-cc034629d96e</entry>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <entry name='family'>Virtual Machine</entry>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <boot dev='hd'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <smbios mode='sysinfo'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <vmcoreinfo state='on'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <cpu mode='custom' match='exact' check='full'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <model fallback='forbid'>Nehalem</model>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <feature policy='require' name='x2apic'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <feature policy='require' name='hypervisor'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <feature policy='require' name='vme'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <clock offset='utc'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <timer name='pit' tickpolicy='delay'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <timer name='hpet' present='no'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <on_poweroff>destroy</on_poweroff>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <on_reboot>restart</on_reboot>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <on_crash>destroy</on_crash>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <disk type='file' device='disk'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <source file='/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk' index='2'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <backingStore type='file' index='3'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:        <format type='raw'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:        <source file='/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:        <backingStore/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      </backingStore>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target dev='vda' bus='virtio'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='virtio-disk0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <disk type='file' device='cdrom'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <driver name='qemu' type='raw' cache='none'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <source file='/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk.config' index='1'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <backingStore/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target dev='sda' bus='sata'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <readonly/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='sata0-0-0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='0' model='pcie-root'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pcie.0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='1' port='0x10'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.1'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='2' port='0x11'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.2'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='3' port='0x12'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.3'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='4' port='0x13'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.4'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='5' port='0x14'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.5'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='6' port='0x15'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.6'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='7' port='0x16'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.7'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='8' port='0x17'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.8'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='9' port='0x18'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.9'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='10' port='0x19'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.10'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='11' port='0x1a'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.11'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='12' port='0x1b'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.12'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='13' port='0x1c'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.13'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='14' port='0x1d'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.14'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='15' port='0x1e'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.15'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='16' port='0x1f'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.16'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='17' port='0x20'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.17'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='18' port='0x21'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.18'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='19' port='0x22'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.19'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='20' port='0x23'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.20'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='21' port='0x24'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.21'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='22' port='0x25'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.22'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='23' port='0x26'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.23'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='24' port='0x27'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.24'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target chassis='25' port='0x28'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.25'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model name='pcie-pci-bridge'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='pci.26'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='usb'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <controller type='sata' index='0'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='ide'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <interface type='ethernet'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <mac address='fa:16:3e:f0:7b:b2'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target dev='tap63b103d2-ef'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model type='virtio'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <driver name='vhost' rx_queue_size='512'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <mtu size='1442'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='net0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <serial type='pty'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <source path='/dev/pts/0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <log file='/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/console.log' append='off'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target type='isa-serial' port='0'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:        <model name='isa-serial'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      </target>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='serial0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <console type='pty' tty='/dev/pts/0'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <source path='/dev/pts/0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <log file='/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/console.log' append='off'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <target type='serial' port='0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='serial0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </console>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <input type='tablet' bus='usb'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='input0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='usb' bus='0' port='1'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </input>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <input type='mouse' bus='ps2'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='input1'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </input>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <input type='keyboard' bus='ps2'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='input2'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </input>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <listen type='address' address='::0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </graphics>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <audio id='1' type='none'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <model type='virtio' heads='1' primary='yes'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='video0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <watchdog model='itco' action='reset'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='watchdog0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </watchdog>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <memballoon model='virtio'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <stats period='10'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='balloon0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <rng model='virtio'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <backend model='random'>/dev/urandom</backend>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <alias name='rng0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <label>system_u:system_r:svirt_t:s0:c236,c324</label>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c236,c324</imagelabel>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  </seclabel>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <label>+107:+107</label>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <imagelabel>+107:+107</imagelabel>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  </seclabel>
Oct  7 16:13:37 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:13:37 np0005474864 nova_compute[192593]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Oct  7 16:13:37 np0005474864 nova_compute[192593]: 2025-10-07 20:13:37.004 2 INFO nova.virt.libvirt.driver [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Successfully detached device tapd400112d-f8 from instance f2a4ae00-d828-4178-880f-cc034629d96e from the live domain config.#033[00m
Oct  7 16:13:37 np0005474864 nova_compute[192593]: 2025-10-07 20:13:37.005 2 DEBUG nova.virt.libvirt.vif [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:12:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-93472678',display_name='tempest-TestNetworkBasicOps-server-93472678',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-93472678',id=15,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNqIkKxPhTDFrF1mYsLGT4VoIX/sFe8HBnealVia+nFLppfgO0Xe3SvxixBmFjjO1nG4Niu4XVzOpfWewXCUpRStprU6Q2hEwG42+Uag+EI9HED37Cp6MNeCsqGkhMnNLA==',key_name='tempest-TestNetworkBasicOps-2102187017',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:12:54Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-93oohg0u',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:12:54Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=f2a4ae00-d828-4178-880f-cc034629d96e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "address": "fa:16:3e:8b:79:39", "network": {"id": "c5280796-8b7c-4945-a626-715e8ea6e041", "bridge": "br-int", "label": "tempest-network-smoke--775622979", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, 
"meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd400112d-f8", "ovs_interfaceid": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:13:37 np0005474864 nova_compute[192593]: 2025-10-07 20:13:37.006 2 DEBUG nova.network.os_vif_util [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converting VIF {"id": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "address": "fa:16:3e:8b:79:39", "network": {"id": "c5280796-8b7c-4945-a626-715e8ea6e041", "bridge": "br-int", "label": "tempest-network-smoke--775622979", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd400112d-f8", "ovs_interfaceid": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:13:37 np0005474864 nova_compute[192593]: 2025-10-07 20:13:37.007 2 DEBUG nova.network.os_vif_util [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:79:39,bridge_name='br-int',has_traffic_filtering=True,id=d400112d-f8c8-435e-a3c9-93cf89fe4cb2,network=Network(c5280796-8b7c-4945-a626-715e8ea6e041),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd400112d-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:13:37 np0005474864 nova_compute[192593]: 2025-10-07 20:13:37.007 2 DEBUG os_vif [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:79:39,bridge_name='br-int',has_traffic_filtering=True,id=d400112d-f8c8-435e-a3c9-93cf89fe4cb2,network=Network(c5280796-8b7c-4945-a626-715e8ea6e041),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd400112d-f8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:13:37 np0005474864 nova_compute[192593]: 2025-10-07 20:13:37.009 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:37 np0005474864 nova_compute[192593]: 2025-10-07 20:13:37.010 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd400112d-f8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:13:37 np0005474864 nova_compute[192593]: 2025-10-07 20:13:37.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:37 np0005474864 nova_compute[192593]: 2025-10-07 20:13:37.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:37 np0005474864 nova_compute[192593]: 2025-10-07 20:13:37.031 2 INFO os_vif [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:79:39,bridge_name='br-int',has_traffic_filtering=True,id=d400112d-f8c8-435e-a3c9-93cf89fe4cb2,network=Network(c5280796-8b7c-4945-a626-715e8ea6e041),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd400112d-f8')#033[00m
Oct  7 16:13:37 np0005474864 nova_compute[192593]: 2025-10-07 20:13:37.032 2 DEBUG nova.virt.libvirt.guest [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <nova:name>tempest-TestNetworkBasicOps-server-93472678</nova:name>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <nova:creationTime>2025-10-07 20:13:37</nova:creationTime>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <nova:flavor name="m1.nano">
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <nova:memory>128</nova:memory>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <nova:disk>1</nova:disk>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <nova:swap>0</nova:swap>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <nova:vcpus>1</nova:vcpus>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  </nova:flavor>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <nova:owner>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <nova:user uuid="fde8db13cdde4728903e9d2749f853e1">tempest-TestNetworkBasicOps-666319938-project-member</nova:user>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <nova:project uuid="57491b24c6b2419c842483a87c8b4d42">tempest-TestNetworkBasicOps-666319938</nova:project>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  </nova:owner>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  <nova:ports>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    <nova:port uuid="63b103d2-ef83-41aa-9080-3137adafe387">
Oct  7 16:13:37 np0005474864 nova_compute[192593]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:    </nova:port>
Oct  7 16:13:37 np0005474864 nova_compute[192593]:  </nova:ports>
Oct  7 16:13:37 np0005474864 nova_compute[192593]: </nova:instance>
Oct  7 16:13:37 np0005474864 nova_compute[192593]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  7 16:13:37 np0005474864 neutron-haproxy-ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041[222996]: [NOTICE]   (223000) : haproxy version is 2.8.14-c23fe91
Oct  7 16:13:37 np0005474864 neutron-haproxy-ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041[222996]: [NOTICE]   (223000) : path to executable is /usr/sbin/haproxy
Oct  7 16:13:37 np0005474864 neutron-haproxy-ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041[222996]: [WARNING]  (223000) : Exiting Master process...
Oct  7 16:13:37 np0005474864 neutron-haproxy-ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041[222996]: [WARNING]  (223000) : Exiting Master process...
Oct  7 16:13:37 np0005474864 neutron-haproxy-ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041[222996]: [ALERT]    (223000) : Current worker (223002) exited with code 143 (Terminated)
Oct  7 16:13:37 np0005474864 neutron-haproxy-ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041[222996]: [WARNING]  (223000) : All workers exited. Exiting... (0)
Oct  7 16:13:37 np0005474864 systemd[1]: libpod-4bd39f2002309b32b4c72fe9c4f64425a1a772b4910b5b448bcaccc665662638.scope: Deactivated successfully.
Oct  7 16:13:37 np0005474864 podman[223034]: 2025-10-07 20:13:37.172941489 +0000 UTC m=+0.047288358 container died 4bd39f2002309b32b4c72fe9c4f64425a1a772b4910b5b448bcaccc665662638 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:13:37 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4bd39f2002309b32b4c72fe9c4f64425a1a772b4910b5b448bcaccc665662638-userdata-shm.mount: Deactivated successfully.
Oct  7 16:13:37 np0005474864 systemd[1]: var-lib-containers-storage-overlay-c6ed1782d16cb1274bad95d4b2b4d6e2a5eafbf934d2a876e65b06738b2e2402-merged.mount: Deactivated successfully.
Oct  7 16:13:37 np0005474864 podman[223034]: 2025-10-07 20:13:37.213555504 +0000 UTC m=+0.087902363 container cleanup 4bd39f2002309b32b4c72fe9c4f64425a1a772b4910b5b448bcaccc665662638 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  7 16:13:37 np0005474864 systemd[1]: libpod-conmon-4bd39f2002309b32b4c72fe9c4f64425a1a772b4910b5b448bcaccc665662638.scope: Deactivated successfully.
Oct  7 16:13:37 np0005474864 podman[223066]: 2025-10-07 20:13:37.278026924 +0000 UTC m=+0.043385556 container remove 4bd39f2002309b32b4c72fe9c4f64425a1a772b4910b5b448bcaccc665662638 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  7 16:13:37 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:37.284 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[11d69198-8a56-4bc9-9c8d-94fd846aae52]: (4, ('Tue Oct  7 08:13:37 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041 (4bd39f2002309b32b4c72fe9c4f64425a1a772b4910b5b448bcaccc665662638)\n4bd39f2002309b32b4c72fe9c4f64425a1a772b4910b5b448bcaccc665662638\nTue Oct  7 08:13:37 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041 (4bd39f2002309b32b4c72fe9c4f64425a1a772b4910b5b448bcaccc665662638)\n4bd39f2002309b32b4c72fe9c4f64425a1a772b4910b5b448bcaccc665662638\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:37 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:37.286 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[cfb408ea-3917-4b9b-b569-7aa4169b45f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:37 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:37.287 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc5280796-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:13:37 np0005474864 nova_compute[192593]: 2025-10-07 20:13:37.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:37 np0005474864 kernel: tapc5280796-80: left promiscuous mode
Oct  7 16:13:37 np0005474864 nova_compute[192593]: 2025-10-07 20:13:37.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:37 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:37.304 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[a30ba2b3-23ad-4b2e-a044-c2d88e638b7d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:37 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:37.336 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[97874b54-90ee-41da-9f33-09ddb1966e1d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:37 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:37.337 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[54c1203a-89e5-456d-9289-05d6d8db6ca8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:37 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:37.353 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[71cb4663-4fdd-433c-86c8-531f6ad99f0a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 364957, 'reachable_time': 15365, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223081, 'error': None, 'target': 'ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:37 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:37.356 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c5280796-8b7c-4945-a626-715e8ea6e041 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:13:37 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:37.356 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[db38a7eb-d41c-4955-9696-c71bd016b6ce]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:37 np0005474864 systemd[1]: run-netns-ovnmeta\x2dc5280796\x2d8b7c\x2d4945\x2da626\x2d715e8ea6e041.mount: Deactivated successfully.
Oct  7 16:13:37 np0005474864 nova_compute[192593]: 2025-10-07 20:13:37.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:38 np0005474864 podman[223082]: 2025-10-07 20:13:38.380838916 +0000 UTC m=+0.074753186 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:13:38 np0005474864 nova_compute[192593]: 2025-10-07 20:13:38.534 2 DEBUG oslo_concurrency.lockutils [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:13:38 np0005474864 nova_compute[192593]: 2025-10-07 20:13:38.535 2 DEBUG oslo_concurrency.lockutils [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquired lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:13:38 np0005474864 nova_compute[192593]: 2025-10-07 20:13:38.535 2 DEBUG nova.network.neutron [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:13:38 np0005474864 nova_compute[192593]: 2025-10-07 20:13:38.793 2 DEBUG nova.compute.manager [req-4670fafc-38df-4970-b6e4-4ace985ca0cc req-69302cfb-6a81-4395-b6b1-c0eb632a441d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received event network-vif-unplugged-d400112d-f8c8-435e-a3c9-93cf89fe4cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:13:38 np0005474864 nova_compute[192593]: 2025-10-07 20:13:38.793 2 DEBUG oslo_concurrency.lockutils [req-4670fafc-38df-4970-b6e4-4ace985ca0cc req-69302cfb-6a81-4395-b6b1-c0eb632a441d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:13:38 np0005474864 nova_compute[192593]: 2025-10-07 20:13:38.793 2 DEBUG oslo_concurrency.lockutils [req-4670fafc-38df-4970-b6e4-4ace985ca0cc req-69302cfb-6a81-4395-b6b1-c0eb632a441d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:13:38 np0005474864 nova_compute[192593]: 2025-10-07 20:13:38.794 2 DEBUG oslo_concurrency.lockutils [req-4670fafc-38df-4970-b6e4-4ace985ca0cc req-69302cfb-6a81-4395-b6b1-c0eb632a441d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:13:38 np0005474864 nova_compute[192593]: 2025-10-07 20:13:38.794 2 DEBUG nova.compute.manager [req-4670fafc-38df-4970-b6e4-4ace985ca0cc req-69302cfb-6a81-4395-b6b1-c0eb632a441d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] No waiting events found dispatching network-vif-unplugged-d400112d-f8c8-435e-a3c9-93cf89fe4cb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:13:38 np0005474864 nova_compute[192593]: 2025-10-07 20:13:38.794 2 WARNING nova.compute.manager [req-4670fafc-38df-4970-b6e4-4ace985ca0cc req-69302cfb-6a81-4395-b6b1-c0eb632a441d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received unexpected event network-vif-unplugged-d400112d-f8c8-435e-a3c9-93cf89fe4cb2 for instance with vm_state active and task_state None.#033[00m
Oct  7 16:13:38 np0005474864 nova_compute[192593]: 2025-10-07 20:13:38.794 2 DEBUG nova.compute.manager [req-4670fafc-38df-4970-b6e4-4ace985ca0cc req-69302cfb-6a81-4395-b6b1-c0eb632a441d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received event network-vif-plugged-d400112d-f8c8-435e-a3c9-93cf89fe4cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:13:38 np0005474864 nova_compute[192593]: 2025-10-07 20:13:38.794 2 DEBUG oslo_concurrency.lockutils [req-4670fafc-38df-4970-b6e4-4ace985ca0cc req-69302cfb-6a81-4395-b6b1-c0eb632a441d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:13:38 np0005474864 nova_compute[192593]: 2025-10-07 20:13:38.795 2 DEBUG oslo_concurrency.lockutils [req-4670fafc-38df-4970-b6e4-4ace985ca0cc req-69302cfb-6a81-4395-b6b1-c0eb632a441d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:13:38 np0005474864 nova_compute[192593]: 2025-10-07 20:13:38.795 2 DEBUG oslo_concurrency.lockutils [req-4670fafc-38df-4970-b6e4-4ace985ca0cc req-69302cfb-6a81-4395-b6b1-c0eb632a441d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:13:38 np0005474864 nova_compute[192593]: 2025-10-07 20:13:38.795 2 DEBUG nova.compute.manager [req-4670fafc-38df-4970-b6e4-4ace985ca0cc req-69302cfb-6a81-4395-b6b1-c0eb632a441d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] No waiting events found dispatching network-vif-plugged-d400112d-f8c8-435e-a3c9-93cf89fe4cb2 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:13:38 np0005474864 nova_compute[192593]: 2025-10-07 20:13:38.795 2 WARNING nova.compute.manager [req-4670fafc-38df-4970-b6e4-4ace985ca0cc req-69302cfb-6a81-4395-b6b1-c0eb632a441d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received unexpected event network-vif-plugged-d400112d-f8c8-435e-a3c9-93cf89fe4cb2 for instance with vm_state active and task_state None.#033[00m
Oct  7 16:13:39 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:39Z|00099|binding|INFO|Releasing lport 8debe99b-81af-459e-8217-4b06af1ff98e from this chassis (sb_readonly=0)
Oct  7 16:13:39 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:39Z|00100|binding|INFO|Releasing lport 35331a4a-db8c-4977-9b95-771260e3e40b from this chassis (sb_readonly=0)
Oct  7 16:13:39 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:39Z|00101|binding|INFO|Releasing lport f34158c9-a766-4691-8248-3424f7b7ca88 from this chassis (sb_readonly=0)
Oct  7 16:13:39 np0005474864 nova_compute[192593]: 2025-10-07 20:13:39.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.652 2 DEBUG nova.compute.manager [req-0b00436f-ca4b-4fc1-8aa4-d020b8b8900a req-37658ecf-293f-48ec-ae18-7aa1365c2c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received event network-vif-deleted-d400112d-f8c8-435e-a3c9-93cf89fe4cb2 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.653 2 INFO nova.compute.manager [req-0b00436f-ca4b-4fc1-8aa4-d020b8b8900a req-37658ecf-293f-48ec-ae18-7aa1365c2c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Neutron deleted interface d400112d-f8c8-435e-a3c9-93cf89fe4cb2; detaching it from the instance and deleting it from the info cache#033[00m
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.653 2 DEBUG nova.network.neutron [req-0b00436f-ca4b-4fc1-8aa4-d020b8b8900a req-37658ecf-293f-48ec-ae18-7aa1365c2c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Updating instance_info_cache with network_info: [{"id": "63b103d2-ef83-41aa-9080-3137adafe387", "address": "fa:16:3e:f0:7b:b2", "network": {"id": "b153978d-a2d5-4c7d-8ff5-8249927e8e0f", "bridge": "br-int", "label": "tempest-network-smoke--1424478439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": null, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63b103d2-ef", "ovs_interfaceid": "63b103d2-ef83-41aa-9080-3137adafe387", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.680 2 DEBUG nova.objects.instance [req-0b00436f-ca4b-4fc1-8aa4-d020b8b8900a req-37658ecf-293f-48ec-ae18-7aa1365c2c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lazy-loading 'system_metadata' on Instance uuid f2a4ae00-d828-4178-880f-cc034629d96e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.718 2 DEBUG nova.objects.instance [req-0b00436f-ca4b-4fc1-8aa4-d020b8b8900a req-37658ecf-293f-48ec-ae18-7aa1365c2c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lazy-loading 'flavor' on Instance uuid f2a4ae00-d828-4178-880f-cc034629d96e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.739 2 DEBUG nova.virt.libvirt.vif [req-0b00436f-ca4b-4fc1-8aa4-d020b8b8900a req-37658ecf-293f-48ec-ae18-7aa1365c2c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:12:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-93472678',display_name='tempest-TestNetworkBasicOps-server-93472678',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-93472678',id=15,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNqIkKxPhTDFrF1mYsLGT4VoIX/sFe8HBnealVia+nFLppfgO0Xe3SvxixBmFjjO1nG4Niu4XVzOpfWewXCUpRStprU6Q2hEwG42+Uag+EI9HED37Cp6MNeCsqGkhMnNLA==',key_name='tempest-TestNetworkBasicOps-2102187017',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:12:54Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-93oohg0u',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:12:54Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=f2a4ae00-d828-4178-880f-cc034629d96e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "address": "fa:16:3e:8b:79:39", "network": {"id": "c5280796-8b7c-4945-a626-715e8ea6e041", "bridge": "br-int", "label": "tempest-network-smoke--775622979", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd400112d-f8", "ovs_interfaceid": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.739 2 DEBUG nova.network.os_vif_util [req-0b00436f-ca4b-4fc1-8aa4-d020b8b8900a req-37658ecf-293f-48ec-ae18-7aa1365c2c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Converting VIF {"id": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "address": "fa:16:3e:8b:79:39", "network": {"id": "c5280796-8b7c-4945-a626-715e8ea6e041", "bridge": "br-int", "label": "tempest-network-smoke--775622979", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd400112d-f8", "ovs_interfaceid": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.740 2 DEBUG nova.network.os_vif_util [req-0b00436f-ca4b-4fc1-8aa4-d020b8b8900a req-37658ecf-293f-48ec-ae18-7aa1365c2c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:79:39,bridge_name='br-int',has_traffic_filtering=True,id=d400112d-f8c8-435e-a3c9-93cf89fe4cb2,network=Network(c5280796-8b7c-4945-a626-715e8ea6e041),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd400112d-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.745 2 DEBUG nova.virt.libvirt.guest [req-0b00436f-ca4b-4fc1-8aa4-d020b8b8900a req-37658ecf-293f-48ec-ae18-7aa1365c2c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:8b:79:39"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd400112d-f8"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.751 2 DEBUG nova.virt.libvirt.guest [req-0b00436f-ca4b-4fc1-8aa4-d020b8b8900a req-37658ecf-293f-48ec-ae18-7aa1365c2c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:8b:79:39"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd400112d-f8"/></interface>not found in domain: <domain type='kvm' id='5'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <name>instance-0000000f</name>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <uuid>f2a4ae00-d828-4178-880f-cc034629d96e</uuid>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:name>tempest-TestNetworkBasicOps-server-93472678</nova:name>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:creationTime>2025-10-07 20:13:37</nova:creationTime>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:flavor name="m1.nano">
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:memory>128</nova:memory>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:disk>1</nova:disk>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:swap>0</nova:swap>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:vcpus>1</nova:vcpus>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </nova:flavor>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:owner>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:user uuid="fde8db13cdde4728903e9d2749f853e1">tempest-TestNetworkBasicOps-666319938-project-member</nova:user>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:project uuid="57491b24c6b2419c842483a87c8b4d42">tempest-TestNetworkBasicOps-666319938</nova:project>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </nova:owner>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:ports>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:port uuid="63b103d2-ef83-41aa-9080-3137adafe387">
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </nova:port>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </nova:ports>
Oct  7 16:13:40 np0005474864 nova_compute[192593]: </nova:instance>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <memory unit='KiB'>131072</memory>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <vcpu placement='static'>1</vcpu>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <resource>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <partition>/machine</partition>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </resource>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <sysinfo type='smbios'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <entry name='manufacturer'>RDO</entry>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <entry name='product'>OpenStack Compute</entry>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <entry name='serial'>f2a4ae00-d828-4178-880f-cc034629d96e</entry>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <entry name='uuid'>f2a4ae00-d828-4178-880f-cc034629d96e</entry>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <entry name='family'>Virtual Machine</entry>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <boot dev='hd'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <smbios mode='sysinfo'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <vmcoreinfo state='on'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <cpu mode='custom' match='exact' check='full'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <model fallback='forbid'>Nehalem</model>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <feature policy='require' name='x2apic'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <feature policy='require' name='hypervisor'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <feature policy='require' name='vme'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <clock offset='utc'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <timer name='pit' tickpolicy='delay'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <timer name='hpet' present='no'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <on_poweroff>destroy</on_poweroff>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <on_reboot>restart</on_reboot>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <on_crash>destroy</on_crash>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <disk type='file' device='disk'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <source file='/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk' index='2'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <backingStore type='file' index='3'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:        <format type='raw'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:        <source file='/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:        <backingStore/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      </backingStore>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target dev='vda' bus='virtio'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='virtio-disk0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <disk type='file' device='cdrom'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <driver name='qemu' type='raw' cache='none'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <source file='/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk.config' index='1'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <backingStore/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target dev='sda' bus='sata'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <readonly/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='sata0-0-0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='0' model='pcie-root'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pcie.0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='1' port='0x10'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.1'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='2' port='0x11'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.2'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='3' port='0x12'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.3'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='4' port='0x13'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.4'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='5' port='0x14'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.5'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='6' port='0x15'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.6'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='7' port='0x16'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.7'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='8' port='0x17'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.8'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='9' port='0x18'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.9'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='10' port='0x19'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.10'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='11' port='0x1a'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.11'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='12' port='0x1b'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.12'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='13' port='0x1c'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.13'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='14' port='0x1d'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.14'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='15' port='0x1e'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.15'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='16' port='0x1f'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.16'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='17' port='0x20'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.17'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='18' port='0x21'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.18'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='19' port='0x22'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.19'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='20' port='0x23'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.20'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='21' port='0x24'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.21'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='22' port='0x25'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.22'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='23' port='0x26'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.23'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='24' port='0x27'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.24'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='25' port='0x28'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.25'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-pci-bridge'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.26'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='usb'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='sata' index='0'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='ide'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <interface type='ethernet'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <mac address='fa:16:3e:f0:7b:b2'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target dev='tap63b103d2-ef'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model type='virtio'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <driver name='vhost' rx_queue_size='512'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <mtu size='1442'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='net0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <serial type='pty'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <source path='/dev/pts/0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <log file='/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/console.log' append='off'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target type='isa-serial' port='0'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:        <model name='isa-serial'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      </target>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='serial0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <console type='pty' tty='/dev/pts/0'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <source path='/dev/pts/0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <log file='/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/console.log' append='off'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target type='serial' port='0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='serial0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </console>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <input type='tablet' bus='usb'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='input0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='usb' bus='0' port='1'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </input>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <input type='mouse' bus='ps2'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='input1'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </input>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <input type='keyboard' bus='ps2'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='input2'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </input>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <listen type='address' address='::0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </graphics>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <audio id='1' type='none'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model type='virtio' heads='1' primary='yes'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='video0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <watchdog model='itco' action='reset'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='watchdog0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </watchdog>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <memballoon model='virtio'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <stats period='10'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='balloon0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <rng model='virtio'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <backend model='random'>/dev/urandom</backend>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='rng0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <label>system_u:system_r:svirt_t:s0:c236,c324</label>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c236,c324</imagelabel>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </seclabel>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <label>+107:+107</label>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <imagelabel>+107:+107</imagelabel>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </seclabel>
Oct  7 16:13:40 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:13:40 np0005474864 nova_compute[192593]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.752 2 DEBUG nova.virt.libvirt.guest [req-0b00436f-ca4b-4fc1-8aa4-d020b8b8900a req-37658ecf-293f-48ec-ae18-7aa1365c2c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] looking for interface given config: <interface type="ethernet"><mac address="fa:16:3e:8b:79:39"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd400112d-f8"/></interface> get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:257
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.757 2 DEBUG nova.virt.libvirt.guest [req-0b00436f-ca4b-4fc1-8aa4-d020b8b8900a req-37658ecf-293f-48ec-ae18-7aa1365c2c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] interface for config: <interface type="ethernet"><mac address="fa:16:3e:8b:79:39"/><model type="virtio"/><driver name="vhost" rx_queue_size="512"/><mtu size="1442"/><target dev="tapd400112d-f8"/></interface> not found in domain: <domain type='kvm' id='5'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <name>instance-0000000f</name>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <uuid>f2a4ae00-d828-4178-880f-cc034629d96e</uuid>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1" xmlns:instance="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:name>tempest-TestNetworkBasicOps-server-93472678</nova:name>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:creationTime>2025-10-07 20:13:37</nova:creationTime>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:flavor name="m1.nano">
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:memory>128</nova:memory>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:disk>1</nova:disk>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:swap>0</nova:swap>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:vcpus>1</nova:vcpus>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </nova:flavor>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:owner>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:user uuid="fde8db13cdde4728903e9d2749f853e1">tempest-TestNetworkBasicOps-666319938-project-member</nova:user>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:project uuid="57491b24c6b2419c842483a87c8b4d42">tempest-TestNetworkBasicOps-666319938</nova:project>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </nova:owner>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:ports>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:port uuid="63b103d2-ef83-41aa-9080-3137adafe387">
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </nova:port>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </nova:ports>
Oct  7 16:13:40 np0005474864 nova_compute[192593]: </nova:instance>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <memory unit='KiB'>131072</memory>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <currentMemory unit='KiB'>131072</currentMemory>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <vcpu placement='static'>1</vcpu>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <resource>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <partition>/machine</partition>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </resource>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <sysinfo type='smbios'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <entry name='manufacturer'>RDO</entry>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <entry name='product'>OpenStack Compute</entry>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <entry name='version'>27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <entry name='serial'>f2a4ae00-d828-4178-880f-cc034629d96e</entry>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <entry name='uuid'>f2a4ae00-d828-4178-880f-cc034629d96e</entry>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <entry name='family'>Virtual Machine</entry>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <type arch='x86_64' machine='pc-q35-rhel9.6.0'>hvm</type>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <boot dev='hd'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <smbios mode='sysinfo'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <vmcoreinfo state='on'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <cpu mode='custom' match='exact' check='full'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <model fallback='forbid'>Nehalem</model>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <topology sockets='1' dies='1' clusters='1' cores='1' threads='1'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <feature policy='require' name='x2apic'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <feature policy='require' name='hypervisor'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <feature policy='require' name='vme'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <clock offset='utc'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <timer name='pit' tickpolicy='delay'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <timer name='rtc' tickpolicy='catchup'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <timer name='hpet' present='no'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <on_poweroff>destroy</on_poweroff>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <on_reboot>restart</on_reboot>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <on_crash>destroy</on_crash>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <emulator>/usr/libexec/qemu-kvm</emulator>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <disk type='file' device='disk'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <driver name='qemu' type='qcow2' cache='none'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <source file='/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk' index='2'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <backingStore type='file' index='3'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:        <format type='raw'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:        <source file='/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:        <backingStore/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      </backingStore>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target dev='vda' bus='virtio'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='virtio-disk0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x03' slot='0x00' function='0x0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <disk type='file' device='cdrom'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <driver name='qemu' type='raw' cache='none'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <source file='/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/disk.config' index='1'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <backingStore/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target dev='sda' bus='sata'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <readonly/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='sata0-0-0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='drive' controller='0' bus='0' target='0' unit='0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='0' model='pcie-root'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pcie.0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='1' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='1' port='0x10'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.1'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x0' multifunction='on'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='2' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='2' port='0x11'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.2'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x1'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='3' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='3' port='0x12'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.3'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x2'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='4' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='4' port='0x13'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.4'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x3'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='5' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='5' port='0x14'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.5'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x4'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='6' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='6' port='0x15'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.6'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x5'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='7' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='7' port='0x16'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.7'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x6'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='8' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='8' port='0x17'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.8'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x02' function='0x7'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='9' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='9' port='0x18'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.9'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x0' multifunction='on'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='10' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='10' port='0x19'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.10'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x1'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='11' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='11' port='0x1a'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.11'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x2'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='12' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='12' port='0x1b'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.12'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x3'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='13' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='13' port='0x1c'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.13'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x4'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='14' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='14' port='0x1d'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.14'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x5'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='15' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='15' port='0x1e'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.15'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x6'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='16' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='16' port='0x1f'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.16'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x03' function='0x7'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='17' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='17' port='0x20'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.17'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x0' multifunction='on'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='18' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='18' port='0x21'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.18'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x1'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='19' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='19' port='0x22'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.19'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x2'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='20' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='20' port='0x23'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.20'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x3'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='21' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='21' port='0x24'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.21'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x4'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='22' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='22' port='0x25'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.22'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x5'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='23' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='23' port='0x26'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.23'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x6'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='24' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='24' port='0x27'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.24'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x04' function='0x7'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='25' model='pcie-root-port'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-root-port'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target chassis='25' port='0x28'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.25'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x05' function='0x0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='pci' index='26' model='pcie-to-pci-bridge'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model name='pcie-pci-bridge'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='pci.26'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x01' slot='0x00' function='0x0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='usb' index='0' model='piix3-uhci'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='usb'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x1a' slot='0x01' function='0x0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <controller type='sata' index='0'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='ide'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x1f' function='0x2'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </controller>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <interface type='ethernet'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <mac address='fa:16:3e:f0:7b:b2'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target dev='tap63b103d2-ef'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model type='virtio'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <driver name='vhost' rx_queue_size='512'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <mtu size='1442'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='net0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x02' slot='0x00' function='0x0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <serial type='pty'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <source path='/dev/pts/0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <log file='/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/console.log' append='off'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target type='isa-serial' port='0'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:        <model name='isa-serial'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      </target>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='serial0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <console type='pty' tty='/dev/pts/0'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <source path='/dev/pts/0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <log file='/var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e/console.log' append='off'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <target type='serial' port='0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='serial0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </console>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <input type='tablet' bus='usb'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='input0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='usb' bus='0' port='1'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </input>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <input type='mouse' bus='ps2'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='input1'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </input>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <input type='keyboard' bus='ps2'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='input2'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </input>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <graphics type='vnc' port='5900' autoport='yes' listen='::0'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <listen type='address' address='::0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </graphics>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <audio id='1' type='none'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <model type='virtio' heads='1' primary='yes'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='video0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x00' slot='0x01' function='0x0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <watchdog model='itco' action='reset'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='watchdog0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </watchdog>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <memballoon model='virtio'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <stats period='10'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='balloon0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x04' slot='0x00' function='0x0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <rng model='virtio'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <backend model='random'>/dev/urandom</backend>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <alias name='rng0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <address type='pci' domain='0x0000' bus='0x05' slot='0x00' function='0x0'/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <seclabel type='dynamic' model='selinux' relabel='yes'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <label>system_u:system_r:svirt_t:s0:c236,c324</label>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <imagelabel>system_u:object_r:svirt_image_t:s0:c236,c324</imagelabel>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </seclabel>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <seclabel type='dynamic' model='dac' relabel='yes'>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <label>+107:+107</label>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <imagelabel>+107:+107</imagelabel>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </seclabel>
Oct  7 16:13:40 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:13:40 np0005474864 nova_compute[192593]: get_interface_by_cfg /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:282#033[00m
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.757 2 WARNING nova.virt.libvirt.driver [req-0b00436f-ca4b-4fc1-8aa4-d020b8b8900a req-37658ecf-293f-48ec-ae18-7aa1365c2c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Detaching interface fa:16:3e:8b:79:39 failed because the device is no longer found on the guest.: nova.exception.DeviceNotFound: Device 'tapd400112d-f8' not found.#033[00m
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.759 2 DEBUG nova.virt.libvirt.vif [req-0b00436f-ca4b-4fc1-8aa4-d020b8b8900a req-37658ecf-293f-48ec-ae18-7aa1365c2c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:12:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-93472678',display_name='tempest-TestNetworkBasicOps-server-93472678',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-93472678',id=15,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNqIkKxPhTDFrF1mYsLGT4VoIX/sFe8HBnealVia+nFLppfgO0Xe3SvxixBmFjjO1nG4Niu4XVzOpfWewXCUpRStprU6Q2hEwG42+Uag+EI9HED37Cp6MNeCsqGkhMnNLA==',key_name='tempest-TestNetworkBasicOps-2102187017',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:12:54Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata=<?>,migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-93oohg0u',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:12:54Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=f2a4ae00-d828-4178-880f-cc034629d96e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "address": "fa:16:3e:8b:79:39", "network": {"id": "c5280796-8b7c-4945-a626-715e8ea6e041", "bridge": "br-int", "label": "tempest-network-smoke--775622979", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, 
"floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd400112d-f8", "ovs_interfaceid": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.759 2 DEBUG nova.network.os_vif_util [req-0b00436f-ca4b-4fc1-8aa4-d020b8b8900a req-37658ecf-293f-48ec-ae18-7aa1365c2c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Converting VIF {"id": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "address": "fa:16:3e:8b:79:39", "network": {"id": "c5280796-8b7c-4945-a626-715e8ea6e041", "bridge": "br-int", "label": "tempest-network-smoke--775622979", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": null, "type": "gateway", "version": null, "meta": {}}, "ips": [{"address": "10.100.0.22", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapd400112d-f8", "ovs_interfaceid": "d400112d-f8c8-435e-a3c9-93cf89fe4cb2", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.760 2 DEBUG nova.network.os_vif_util [req-0b00436f-ca4b-4fc1-8aa4-d020b8b8900a req-37658ecf-293f-48ec-ae18-7aa1365c2c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8b:79:39,bridge_name='br-int',has_traffic_filtering=True,id=d400112d-f8c8-435e-a3c9-93cf89fe4cb2,network=Network(c5280796-8b7c-4945-a626-715e8ea6e041),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd400112d-f8') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.761 2 DEBUG os_vif [req-0b00436f-ca4b-4fc1-8aa4-d020b8b8900a req-37658ecf-293f-48ec-ae18-7aa1365c2c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:79:39,bridge_name='br-int',has_traffic_filtering=True,id=d400112d-f8c8-435e-a3c9-93cf89fe4cb2,network=Network(c5280796-8b7c-4945-a626-715e8ea6e041),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd400112d-f8') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.764 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd400112d-f8, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.764 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.768 2 INFO os_vif [req-0b00436f-ca4b-4fc1-8aa4-d020b8b8900a req-37658ecf-293f-48ec-ae18-7aa1365c2c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8b:79:39,bridge_name='br-int',has_traffic_filtering=True,id=d400112d-f8c8-435e-a3c9-93cf89fe4cb2,network=Network(c5280796-8b7c-4945-a626-715e8ea6e041),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapd400112d-f8')#033[00m
Oct  7 16:13:40 np0005474864 nova_compute[192593]: 2025-10-07 20:13:40.769 2 DEBUG nova.virt.libvirt.guest [req-0b00436f-ca4b-4fc1-8aa4-d020b8b8900a req-37658ecf-293f-48ec-ae18-7aa1365c2c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] set metadata xml: <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:name>tempest-TestNetworkBasicOps-server-93472678</nova:name>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:creationTime>2025-10-07 20:13:40</nova:creationTime>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:flavor name="m1.nano">
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:memory>128</nova:memory>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:disk>1</nova:disk>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:swap>0</nova:swap>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:vcpus>1</nova:vcpus>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </nova:flavor>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:owner>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:user uuid="fde8db13cdde4728903e9d2749f853e1">tempest-TestNetworkBasicOps-666319938-project-member</nova:user>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:project uuid="57491b24c6b2419c842483a87c8b4d42">tempest-TestNetworkBasicOps-666319938</nova:project>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </nova:owner>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  <nova:ports>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    <nova:port uuid="63b103d2-ef83-41aa-9080-3137adafe387">
Oct  7 16:13:40 np0005474864 nova_compute[192593]:      <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:    </nova:port>
Oct  7 16:13:40 np0005474864 nova_compute[192593]:  </nova:ports>
Oct  7 16:13:40 np0005474864 nova_compute[192593]: </nova:instance>
Oct  7 16:13:40 np0005474864 nova_compute[192593]: set_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:359#033[00m
Oct  7 16:13:41 np0005474864 nova_compute[192593]: 2025-10-07 20:13:41.254 2 INFO nova.network.neutron [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Port d400112d-f8c8-435e-a3c9-93cf89fe4cb2 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Oct  7 16:13:41 np0005474864 nova_compute[192593]: 2025-10-07 20:13:41.254 2 DEBUG nova.network.neutron [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Updating instance_info_cache with network_info: [{"id": "63b103d2-ef83-41aa-9080-3137adafe387", "address": "fa:16:3e:f0:7b:b2", "network": {"id": "b153978d-a2d5-4c7d-8ff5-8249927e8e0f", "bridge": "br-int", "label": "tempest-network-smoke--1424478439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63b103d2-ef", "ovs_interfaceid": "63b103d2-ef83-41aa-9080-3137adafe387", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:13:41 np0005474864 nova_compute[192593]: 2025-10-07 20:13:41.280 2 DEBUG oslo_concurrency.lockutils [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Releasing lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:13:41 np0005474864 nova_compute[192593]: 2025-10-07 20:13:41.321 2 DEBUG oslo_concurrency.lockutils [None req-bf2be903-46b6-4d57-ad15-8ccc2cd30e60 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "interface-f2a4ae00-d828-4178-880f-cc034629d96e-d400112d-f8c8-435e-a3c9-93cf89fe4cb2" "released" by "nova.compute.manager.ComputeManager.detach_interface.<locals>.do_detach_interface" :: held 4.537s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:13:41 np0005474864 nova_compute[192593]: 2025-10-07 20:13:41.792 2 DEBUG oslo_concurrency.lockutils [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "f2a4ae00-d828-4178-880f-cc034629d96e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:13:41 np0005474864 nova_compute[192593]: 2025-10-07 20:13:41.792 2 DEBUG oslo_concurrency.lockutils [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:13:41 np0005474864 nova_compute[192593]: 2025-10-07 20:13:41.792 2 DEBUG oslo_concurrency.lockutils [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:13:41 np0005474864 nova_compute[192593]: 2025-10-07 20:13:41.793 2 DEBUG oslo_concurrency.lockutils [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:13:41 np0005474864 nova_compute[192593]: 2025-10-07 20:13:41.793 2 DEBUG oslo_concurrency.lockutils [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:13:41 np0005474864 nova_compute[192593]: 2025-10-07 20:13:41.794 2 INFO nova.compute.manager [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Terminating instance#033[00m
Oct  7 16:13:41 np0005474864 nova_compute[192593]: 2025-10-07 20:13:41.795 2 DEBUG nova.compute.manager [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  7 16:13:41 np0005474864 kernel: tap63b103d2-ef (unregistering): left promiscuous mode
Oct  7 16:13:41 np0005474864 NetworkManager[51631]: <info>  [1759868021.8236] device (tap63b103d2-ef): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:13:41 np0005474864 nova_compute[192593]: 2025-10-07 20:13:41.831 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:41 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:41Z|00102|binding|INFO|Releasing lport 63b103d2-ef83-41aa-9080-3137adafe387 from this chassis (sb_readonly=0)
Oct  7 16:13:41 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:41Z|00103|binding|INFO|Setting lport 63b103d2-ef83-41aa-9080-3137adafe387 down in Southbound
Oct  7 16:13:41 np0005474864 nova_compute[192593]: 2025-10-07 20:13:41.833 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:41 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:41Z|00104|binding|INFO|Removing iface tap63b103d2-ef ovn-installed in OVS
Oct  7 16:13:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:41.844 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f0:7b:b2 10.100.0.9'], port_security=['fa:16:3e:f0:7b:b2 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'f2a4ae00-d828-4178-880f-cc034629d96e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b153978d-a2d5-4c7d-8ff5-8249927e8e0f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '57491b24c6b2419c842483a87c8b4d42', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'd4fc7d5a-f2f2-4b9c-aee9-ab5a55c88c50', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=71790900-696c-48c8-9845-d2f1ae8dbfc4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=63b103d2-ef83-41aa-9080-3137adafe387) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:13:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:41.846 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 63b103d2-ef83-41aa-9080-3137adafe387 in datapath b153978d-a2d5-4c7d-8ff5-8249927e8e0f unbound from our chassis#033[00m
Oct  7 16:13:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:41.849 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b153978d-a2d5-4c7d-8ff5-8249927e8e0f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:13:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:41.850 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[658d1b50-7461-4ebb-9758-7c8405d0adf8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:41.851 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f namespace which is not needed anymore#033[00m
Oct  7 16:13:41 np0005474864 nova_compute[192593]: 2025-10-07 20:13:41.862 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:41 np0005474864 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000f.scope: Deactivated successfully.
Oct  7 16:13:41 np0005474864 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000f.scope: Consumed 14.935s CPU time.
Oct  7 16:13:41 np0005474864 systemd-machined[152586]: Machine qemu-5-instance-0000000f terminated.
Oct  7 16:13:41 np0005474864 podman[223103]: 2025-10-07 20:13:41.964648909 +0000 UTC m=+0.108244027 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.012 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.020 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.029 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:42 np0005474864 neutron-haproxy-ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f[222358]: [NOTICE]   (222362) : haproxy version is 2.8.14-c23fe91
Oct  7 16:13:42 np0005474864 neutron-haproxy-ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f[222358]: [NOTICE]   (222362) : path to executable is /usr/sbin/haproxy
Oct  7 16:13:42 np0005474864 neutron-haproxy-ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f[222358]: [WARNING]  (222362) : Exiting Master process...
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.071 2 INFO nova.virt.libvirt.driver [-] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Instance destroyed successfully.#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.074 2 DEBUG nova.objects.instance [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lazy-loading 'resources' on Instance uuid f2a4ae00-d828-4178-880f-cc034629d96e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:13:42 np0005474864 neutron-haproxy-ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f[222358]: [ALERT]    (222362) : Current worker (222364) exited with code 143 (Terminated)
Oct  7 16:13:42 np0005474864 neutron-haproxy-ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f[222358]: [WARNING]  (222362) : All workers exited. Exiting... (0)
Oct  7 16:13:42 np0005474864 systemd[1]: libpod-890454fb28a95b3a710f1657b6985918bf9571a5d0e81eb321902f36d262a495.scope: Deactivated successfully.
Oct  7 16:13:42 np0005474864 podman[223149]: 2025-10-07 20:13:42.087463373 +0000 UTC m=+0.087458591 container died 890454fb28a95b3a710f1657b6985918bf9571a5d0e81eb321902f36d262a495 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.092 2 DEBUG nova.virt.libvirt.vif [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:12:33Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-93472678',display_name='tempest-TestNetworkBasicOps-server-93472678',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-93472678',id=15,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNqIkKxPhTDFrF1mYsLGT4VoIX/sFe8HBnealVia+nFLppfgO0Xe3SvxixBmFjjO1nG4Niu4XVzOpfWewXCUpRStprU6Q2hEwG42+Uag+EI9HED37Cp6MNeCsqGkhMnNLA==',key_name='tempest-TestNetworkBasicOps-2102187017',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:12:54Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-93oohg0u',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:12:54Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=f2a4ae00-d828-4178-880f-cc034629d96e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "63b103d2-ef83-41aa-9080-3137adafe387", "address": "fa:16:3e:f0:7b:b2", "network": {"id": "b153978d-a2d5-4c7d-8ff5-8249927e8e0f", "bridge": "br-int", "label": "tempest-network-smoke--1424478439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63b103d2-ef", "ovs_interfaceid": "63b103d2-ef83-41aa-9080-3137adafe387", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.093 2 DEBUG nova.network.os_vif_util [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converting VIF {"id": "63b103d2-ef83-41aa-9080-3137adafe387", "address": "fa:16:3e:f0:7b:b2", "network": {"id": "b153978d-a2d5-4c7d-8ff5-8249927e8e0f", "bridge": "br-int", "label": "tempest-network-smoke--1424478439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.185", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63b103d2-ef", "ovs_interfaceid": "63b103d2-ef83-41aa-9080-3137adafe387", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.093 2 DEBUG nova.network.os_vif_util [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f0:7b:b2,bridge_name='br-int',has_traffic_filtering=True,id=63b103d2-ef83-41aa-9080-3137adafe387,network=Network(b153978d-a2d5-4c7d-8ff5-8249927e8e0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63b103d2-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.094 2 DEBUG os_vif [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:7b:b2,bridge_name='br-int',has_traffic_filtering=True,id=63b103d2-ef83-41aa-9080-3137adafe387,network=Network(b153978d-a2d5-4c7d-8ff5-8249927e8e0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63b103d2-ef') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.095 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63b103d2-ef, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.096 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.101 2 INFO os_vif [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f0:7b:b2,bridge_name='br-int',has_traffic_filtering=True,id=63b103d2-ef83-41aa-9080-3137adafe387,network=Network(b153978d-a2d5-4c7d-8ff5-8249927e8e0f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63b103d2-ef')#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.101 2 INFO nova.virt.libvirt.driver [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Deleting instance files /var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e_del#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.102 2 INFO nova.virt.libvirt.driver [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Deletion of /var/lib/nova/instances/f2a4ae00-d828-4178-880f-cc034629d96e_del complete#033[00m
Oct  7 16:13:42 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-890454fb28a95b3a710f1657b6985918bf9571a5d0e81eb321902f36d262a495-userdata-shm.mount: Deactivated successfully.
Oct  7 16:13:42 np0005474864 systemd[1]: var-lib-containers-storage-overlay-a49fbdc4e04a8c0abd534c13cf95f7aac040979398de302462cafb60d4fabac2-merged.mount: Deactivated successfully.
Oct  7 16:13:42 np0005474864 podman[223149]: 2025-10-07 20:13:42.140874765 +0000 UTC m=+0.140869973 container cleanup 890454fb28a95b3a710f1657b6985918bf9571a5d0e81eb321902f36d262a495 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  7 16:13:42 np0005474864 systemd[1]: libpod-conmon-890454fb28a95b3a710f1657b6985918bf9571a5d0e81eb321902f36d262a495.scope: Deactivated successfully.
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.181 2 INFO nova.compute.manager [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.183 2 DEBUG oslo.service.loopingcall [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.184 2 DEBUG nova.compute.manager [-] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.184 2 DEBUG nova.network.neutron [-] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  7 16:13:42 np0005474864 podman[223196]: 2025-10-07 20:13:42.230431615 +0000 UTC m=+0.060287161 container remove 890454fb28a95b3a710f1657b6985918bf9571a5d0e81eb321902f36d262a495 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  7 16:13:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:42.237 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[cdd84b52-e5c7-47e6-96bb-430a6a0c1a59]: (4, ('Tue Oct  7 08:13:41 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f (890454fb28a95b3a710f1657b6985918bf9571a5d0e81eb321902f36d262a495)\n890454fb28a95b3a710f1657b6985918bf9571a5d0e81eb321902f36d262a495\nTue Oct  7 08:13:42 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f (890454fb28a95b3a710f1657b6985918bf9571a5d0e81eb321902f36d262a495)\n890454fb28a95b3a710f1657b6985918bf9571a5d0e81eb321902f36d262a495\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:42.239 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[27f888da-43b0-4df6-b521-ed6c36a1971f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:42.240 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb153978d-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:42 np0005474864 kernel: tapb153978d-a0: left promiscuous mode
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:42.261 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ec49e686-cd46-4ea7-a71e-e5ffaea7c882]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:42.301 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[f69279b0-b8bc-4c10-9a13-0bb26b3c0079]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:42.303 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[32894f9f-1eb0-4204-93b1-5469df7801de]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:42.330 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[6be52018-8018-41a3-99bd-9798a36df531]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 360049, 'reachable_time': 34471, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223211, 'error': None, 'target': 'ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:42.332 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b153978d-a2d5-4c7d-8ff5-8249927e8e0f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:13:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:42.332 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[3d02bc4e-8ee0-460a-87e9-e0b2b5d6e56d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:13:42 np0005474864 systemd[1]: run-netns-ovnmeta\x2db153978d\x2da2d5\x2d4c7d\x2d8ff5\x2d8249927e8e0f.mount: Deactivated successfully.
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.857 2 DEBUG nova.compute.manager [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received event network-changed-63b103d2-ef83-41aa-9080-3137adafe387 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.858 2 DEBUG nova.compute.manager [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Refreshing instance network info cache due to event network-changed-63b103d2-ef83-41aa-9080-3137adafe387. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.858 2 DEBUG oslo_concurrency.lockutils [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.859 2 DEBUG oslo_concurrency.lockutils [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:13:42 np0005474864 nova_compute[192593]: 2025-10-07 20:13:42.859 2 DEBUG nova.network.neutron [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Refreshing network info cache for port 63b103d2-ef83-41aa-9080-3137adafe387 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:13:43 np0005474864 nova_compute[192593]: 2025-10-07 20:13:43.393 2 DEBUG nova.network.neutron [-] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:13:43 np0005474864 nova_compute[192593]: 2025-10-07 20:13:43.414 2 INFO nova.compute.manager [-] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Took 1.23 seconds to deallocate network for instance.#033[00m
Oct  7 16:13:43 np0005474864 nova_compute[192593]: 2025-10-07 20:13:43.463 2 DEBUG oslo_concurrency.lockutils [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:13:43 np0005474864 nova_compute[192593]: 2025-10-07 20:13:43.463 2 DEBUG oslo_concurrency.lockutils [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:13:43 np0005474864 nova_compute[192593]: 2025-10-07 20:13:43.570 2 DEBUG nova.compute.provider_tree [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:13:43 np0005474864 nova_compute[192593]: 2025-10-07 20:13:43.589 2 DEBUG nova.scheduler.client.report [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:13:43 np0005474864 nova_compute[192593]: 2025-10-07 20:13:43.621 2 DEBUG oslo_concurrency.lockutils [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:13:43 np0005474864 nova_compute[192593]: 2025-10-07 20:13:43.657 2 INFO nova.scheduler.client.report [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Deleted allocations for instance f2a4ae00-d828-4178-880f-cc034629d96e#033[00m
Oct  7 16:13:43 np0005474864 nova_compute[192593]: 2025-10-07 20:13:43.727 2 DEBUG oslo_concurrency.lockutils [None req-978b975a-e9d7-4b4b-9b3f-3f9dcf553da3 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.934s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:13:44 np0005474864 nova_compute[192593]: 2025-10-07 20:13:44.120 2 DEBUG nova.network.neutron [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Updated VIF entry in instance network info cache for port 63b103d2-ef83-41aa-9080-3137adafe387. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:13:44 np0005474864 nova_compute[192593]: 2025-10-07 20:13:44.120 2 DEBUG nova.network.neutron [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Updating instance_info_cache with network_info: [{"id": "63b103d2-ef83-41aa-9080-3137adafe387", "address": "fa:16:3e:f0:7b:b2", "network": {"id": "b153978d-a2d5-4c7d-8ff5-8249927e8e0f", "bridge": "br-int", "label": "tempest-network-smoke--1424478439", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63b103d2-ef", "ovs_interfaceid": "63b103d2-ef83-41aa-9080-3137adafe387", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:13:44 np0005474864 nova_compute[192593]: 2025-10-07 20:13:44.138 2 DEBUG oslo_concurrency.lockutils [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-f2a4ae00-d828-4178-880f-cc034629d96e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:13:44 np0005474864 nova_compute[192593]: 2025-10-07 20:13:44.138 2 DEBUG nova.compute.manager [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received event network-vif-unplugged-63b103d2-ef83-41aa-9080-3137adafe387 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:13:44 np0005474864 nova_compute[192593]: 2025-10-07 20:13:44.139 2 DEBUG oslo_concurrency.lockutils [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:13:44 np0005474864 nova_compute[192593]: 2025-10-07 20:13:44.139 2 DEBUG oslo_concurrency.lockutils [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:13:44 np0005474864 nova_compute[192593]: 2025-10-07 20:13:44.140 2 DEBUG oslo_concurrency.lockutils [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:13:44 np0005474864 nova_compute[192593]: 2025-10-07 20:13:44.140 2 DEBUG nova.compute.manager [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] No waiting events found dispatching network-vif-unplugged-63b103d2-ef83-41aa-9080-3137adafe387 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:13:44 np0005474864 nova_compute[192593]: 2025-10-07 20:13:44.140 2 DEBUG nova.compute.manager [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received event network-vif-unplugged-63b103d2-ef83-41aa-9080-3137adafe387 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  7 16:13:44 np0005474864 nova_compute[192593]: 2025-10-07 20:13:44.141 2 DEBUG nova.compute.manager [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received event network-vif-plugged-63b103d2-ef83-41aa-9080-3137adafe387 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:13:44 np0005474864 nova_compute[192593]: 2025-10-07 20:13:44.141 2 DEBUG oslo_concurrency.lockutils [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:13:44 np0005474864 nova_compute[192593]: 2025-10-07 20:13:44.141 2 DEBUG oslo_concurrency.lockutils [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:13:44 np0005474864 nova_compute[192593]: 2025-10-07 20:13:44.142 2 DEBUG oslo_concurrency.lockutils [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f2a4ae00-d828-4178-880f-cc034629d96e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:13:44 np0005474864 nova_compute[192593]: 2025-10-07 20:13:44.142 2 DEBUG nova.compute.manager [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] No waiting events found dispatching network-vif-plugged-63b103d2-ef83-41aa-9080-3137adafe387 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:13:44 np0005474864 nova_compute[192593]: 2025-10-07 20:13:44.142 2 WARNING nova.compute.manager [req-ec534560-227e-4d52-8d56-58915d26779f req-42c35061-9428-4288-8f92-cadbe9bb4448 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received unexpected event network-vif-plugged-63b103d2-ef83-41aa-9080-3137adafe387 for instance with vm_state active and task_state deleting.#033[00m
Oct  7 16:13:45 np0005474864 nova_compute[192593]: 2025-10-07 20:13:45.101 2 DEBUG nova.compute.manager [req-f8dd866e-42dd-464d-be1a-9ad923c1a009 req-7cb0f066-2a71-4a7c-aaeb-5faa1a56f7b9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Received event network-vif-deleted-63b103d2-ef83-41aa-9080-3137adafe387 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:13:45 np0005474864 podman[223212]: 2025-10-07 20:13:45.403323649 +0000 UTC m=+0.090303532 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  7 16:13:47 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:47Z|00105|binding|INFO|Releasing lport 8debe99b-81af-459e-8217-4b06af1ff98e from this chassis (sb_readonly=0)
Oct  7 16:13:47 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:47Z|00106|binding|INFO|Releasing lport 35331a4a-db8c-4977-9b95-771260e3e40b from this chassis (sb_readonly=0)
Oct  7 16:13:47 np0005474864 nova_compute[192593]: 2025-10-07 20:13:47.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:47 np0005474864 nova_compute[192593]: 2025-10-07 20:13:47.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:47 np0005474864 nova_compute[192593]: 2025-10-07 20:13:47.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:49 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:49.788 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:76:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'ba:bb:91:e8:2b:5d'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:13:49 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:49.790 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  7 16:13:49 np0005474864 nova_compute[192593]: 2025-10-07 20:13:49.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:50 np0005474864 nova_compute[192593]: 2025-10-07 20:13:50.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:51 np0005474864 nova_compute[192593]: 2025-10-07 20:13:51.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:13:51.792 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:13:52 np0005474864 nova_compute[192593]: 2025-10-07 20:13:52.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:52 np0005474864 nova_compute[192593]: 2025-10-07 20:13:52.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:55 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:55Z|00107|binding|INFO|Releasing lport 8debe99b-81af-459e-8217-4b06af1ff98e from this chassis (sb_readonly=0)
Oct  7 16:13:55 np0005474864 ovn_controller[94801]: 2025-10-07T20:13:55Z|00108|binding|INFO|Releasing lport 35331a4a-db8c-4977-9b95-771260e3e40b from this chassis (sb_readonly=0)
Oct  7 16:13:55 np0005474864 nova_compute[192593]: 2025-10-07 20:13:55.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:57 np0005474864 nova_compute[192593]: 2025-10-07 20:13:57.068 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759868022.066897, f2a4ae00-d828-4178-880f-cc034629d96e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:13:57 np0005474864 nova_compute[192593]: 2025-10-07 20:13:57.069 2 INFO nova.compute.manager [-] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] VM Stopped (Lifecycle Event)#033[00m
Oct  7 16:13:57 np0005474864 nova_compute[192593]: 2025-10-07 20:13:57.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:57 np0005474864 nova_compute[192593]: 2025-10-07 20:13:57.110 2 DEBUG nova.compute.manager [None req-1887dfd7-987c-408d-be21-15c7565858a5 - - - - - -] [instance: f2a4ae00-d828-4178-880f-cc034629d96e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:13:57 np0005474864 nova_compute[192593]: 2025-10-07 20:13:57.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:57 np0005474864 nova_compute[192593]: 2025-10-07 20:13:57.905 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:57 np0005474864 nova_compute[192593]: 2025-10-07 20:13:57.967 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:13:58 np0005474864 podman[223232]: 2025-10-07 20:13:58.400596523 +0000 UTC m=+0.089104928 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  7 16:13:58 np0005474864 podman[223233]: 2025-10-07 20:13:58.413371559 +0000 UTC m=+0.093934306 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, config_id=edpm, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., name=ubi9-minimal, io.buildah.version=1.33.7)
Oct  7 16:14:02 np0005474864 nova_compute[192593]: 2025-10-07 20:14:02.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:02 np0005474864 nova_compute[192593]: 2025-10-07 20:14:02.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:04 np0005474864 podman[223280]: 2025-10-07 20:14:04.390451568 +0000 UTC m=+0.072058109 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:14:04 np0005474864 podman[223278]: 2025-10-07 20:14:04.399724244 +0000 UTC m=+0.092622419 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  7 16:14:04 np0005474864 podman[223279]: 2025-10-07 20:14:04.410110512 +0000 UTC m=+0.102838042 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Oct  7 16:14:06 np0005474864 nova_compute[192593]: 2025-10-07 20:14:06.735 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:06 np0005474864 nova_compute[192593]: 2025-10-07 20:14:06.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:07 np0005474864 nova_compute[192593]: 2025-10-07 20:14:07.107 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:08 np0005474864 nova_compute[192593]: 2025-10-07 20:14:08.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:09 np0005474864 podman[223344]: 2025-10-07 20:14:09.40678364 +0000 UTC m=+0.091454325 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:14:12 np0005474864 nova_compute[192593]: 2025-10-07 20:14:12.111 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:12 np0005474864 podman[223364]: 2025-10-07 20:14:12.391446403 +0000 UTC m=+0.078425470 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:14:13 np0005474864 nova_compute[192593]: 2025-10-07 20:14:13.002 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:13 np0005474864 nova_compute[192593]: 2025-10-07 20:14:13.763 2 DEBUG nova.compute.manager [req-021e155a-3cc4-44ca-bfb0-d9a5ef9879c4 req-a435ffe6-b894-41c2-a8e8-5cd1c1644f2b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received event network-changed-7ce9ef63-687e-420f-b85d-071abf475fd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:14:13 np0005474864 nova_compute[192593]: 2025-10-07 20:14:13.763 2 DEBUG nova.compute.manager [req-021e155a-3cc4-44ca-bfb0-d9a5ef9879c4 req-a435ffe6-b894-41c2-a8e8-5cd1c1644f2b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Refreshing instance network info cache due to event network-changed-7ce9ef63-687e-420f-b85d-071abf475fd7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:14:13 np0005474864 nova_compute[192593]: 2025-10-07 20:14:13.764 2 DEBUG oslo_concurrency.lockutils [req-021e155a-3cc4-44ca-bfb0-d9a5ef9879c4 req-a435ffe6-b894-41c2-a8e8-5cd1c1644f2b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-1669315a-9455-4ddc-bddf-b5a535be9294" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:14:13 np0005474864 nova_compute[192593]: 2025-10-07 20:14:13.764 2 DEBUG oslo_concurrency.lockutils [req-021e155a-3cc4-44ca-bfb0-d9a5ef9879c4 req-a435ffe6-b894-41c2-a8e8-5cd1c1644f2b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-1669315a-9455-4ddc-bddf-b5a535be9294" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:14:13 np0005474864 nova_compute[192593]: 2025-10-07 20:14:13.765 2 DEBUG nova.network.neutron [req-021e155a-3cc4-44ca-bfb0-d9a5ef9879c4 req-a435ffe6-b894-41c2-a8e8-5cd1c1644f2b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Refreshing network info cache for port 7ce9ef63-687e-420f-b85d-071abf475fd7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:14:13 np0005474864 nova_compute[192593]: 2025-10-07 20:14:13.956 2 DEBUG oslo_concurrency.lockutils [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "1669315a-9455-4ddc-bddf-b5a535be9294" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:14:13 np0005474864 nova_compute[192593]: 2025-10-07 20:14:13.957 2 DEBUG oslo_concurrency.lockutils [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:14:13 np0005474864 nova_compute[192593]: 2025-10-07 20:14:13.957 2 DEBUG oslo_concurrency.lockutils [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:14:13 np0005474864 nova_compute[192593]: 2025-10-07 20:14:13.957 2 DEBUG oslo_concurrency.lockutils [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:14:13 np0005474864 nova_compute[192593]: 2025-10-07 20:14:13.958 2 DEBUG oslo_concurrency.lockutils [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:14:13 np0005474864 nova_compute[192593]: 2025-10-07 20:14:13.958 2 INFO nova.compute.manager [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Terminating instance#033[00m
Oct  7 16:14:13 np0005474864 nova_compute[192593]: 2025-10-07 20:14:13.959 2 DEBUG nova.compute.manager [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  7 16:14:13 np0005474864 kernel: tap7ce9ef63-68 (unregistering): left promiscuous mode
Oct  7 16:14:13 np0005474864 NetworkManager[51631]: <info>  [1759868053.9813] device (tap7ce9ef63-68): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:14:13 np0005474864 nova_compute[192593]: 2025-10-07 20:14:13.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:13 np0005474864 ovn_controller[94801]: 2025-10-07T20:14:13Z|00109|binding|INFO|Releasing lport 7ce9ef63-687e-420f-b85d-071abf475fd7 from this chassis (sb_readonly=0)
Oct  7 16:14:13 np0005474864 ovn_controller[94801]: 2025-10-07T20:14:13Z|00110|binding|INFO|Setting lport 7ce9ef63-687e-420f-b85d-071abf475fd7 down in Southbound
Oct  7 16:14:13 np0005474864 ovn_controller[94801]: 2025-10-07T20:14:13Z|00111|binding|INFO|Removing iface tap7ce9ef63-68 ovn-installed in OVS
Oct  7 16:14:13 np0005474864 nova_compute[192593]: 2025-10-07 20:14:13.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.007 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8f:62:17 10.100.0.5'], port_security=['fa:16:3e:8f:62:17 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '1669315a-9455-4ddc-bddf-b5a535be9294', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-48bc8cb5-7112-4ac0-bcc2-12066714d0ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a513f697-18f2-4f8c-b79e-8feb80b81d11', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=03b9f824-0816-49ca-b067-98a9e0b122ad, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=7ce9ef63-687e-420f-b85d-071abf475fd7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.010 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 7ce9ef63-687e-420f-b85d-071abf475fd7 in datapath 48bc8cb5-7112-4ac0-bcc2-12066714d0ea unbound from our chassis#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.014 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 48bc8cb5-7112-4ac0-bcc2-12066714d0ea, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.016 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[fb0fa33d-345a-4530-bff4-b6b05a60c719]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.017 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea namespace which is not needed anymore#033[00m
Oct  7 16:14:14 np0005474864 kernel: tapbe7697b8-38 (unregistering): left promiscuous mode
Oct  7 16:14:14 np0005474864 NetworkManager[51631]: <info>  [1759868054.0251] device (tapbe7697b8-38): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.038 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:14 np0005474864 ovn_controller[94801]: 2025-10-07T20:14:14Z|00112|binding|INFO|Releasing lport be7697b8-3851-4db2-8ae0-bc42997f1332 from this chassis (sb_readonly=0)
Oct  7 16:14:14 np0005474864 ovn_controller[94801]: 2025-10-07T20:14:14Z|00113|binding|INFO|Setting lport be7697b8-3851-4db2-8ae0-bc42997f1332 down in Southbound
Oct  7 16:14:14 np0005474864 ovn_controller[94801]: 2025-10-07T20:14:14Z|00114|binding|INFO|Removing iface tapbe7697b8-38 ovn-installed in OVS
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.053 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fc:38:3e 2001:db8::f816:3eff:fefc:383e'], port_security=['fa:16:3e:fc:38:3e 2001:db8::f816:3eff:fefc:383e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fefc:383e/64', 'neutron:device_id': '1669315a-9455-4ddc-bddf-b5a535be9294', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-50d1db2f-7e6a-4b01-96dc-cd47acf22206', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a513f697-18f2-4f8c-b79e-8feb80b81d11', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d381511a-5493-4cd7-9663-2f55bb48bf91, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=be7697b8-3851-4db2-8ae0-bc42997f1332) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:14 np0005474864 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000010.scope: Deactivated successfully.
Oct  7 16:14:14 np0005474864 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d00000010.scope: Consumed 17.946s CPU time.
Oct  7 16:14:14 np0005474864 systemd-machined[152586]: Machine qemu-6-instance-00000010 terminated.
Oct  7 16:14:14 np0005474864 neutron-haproxy-ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea[222486]: [NOTICE]   (222490) : haproxy version is 2.8.14-c23fe91
Oct  7 16:14:14 np0005474864 neutron-haproxy-ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea[222486]: [NOTICE]   (222490) : path to executable is /usr/sbin/haproxy
Oct  7 16:14:14 np0005474864 neutron-haproxy-ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea[222486]: [WARNING]  (222490) : Exiting Master process...
Oct  7 16:14:14 np0005474864 neutron-haproxy-ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea[222486]: [ALERT]    (222490) : Current worker (222499) exited with code 143 (Terminated)
Oct  7 16:14:14 np0005474864 neutron-haproxy-ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea[222486]: [WARNING]  (222490) : All workers exited. Exiting... (0)
Oct  7 16:14:14 np0005474864 systemd[1]: libpod-f48a291815918ed8fb996cb95768bea1138ea5ebdadf13e6f19777f5f874a35d.scope: Deactivated successfully.
Oct  7 16:14:14 np0005474864 ovn_controller[94801]: 2025-10-07T20:14:14Z|00115|binding|INFO|Releasing lport 8debe99b-81af-459e-8217-4b06af1ff98e from this chassis (sb_readonly=0)
Oct  7 16:14:14 np0005474864 ovn_controller[94801]: 2025-10-07T20:14:14Z|00116|binding|INFO|Releasing lport 35331a4a-db8c-4977-9b95-771260e3e40b from this chassis (sb_readonly=0)
Oct  7 16:14:14 np0005474864 podman[223418]: 2025-10-07 20:14:14.176603822 +0000 UTC m=+0.054116554 container died f48a291815918ed8fb996cb95768bea1138ea5ebdadf13e6f19777f5f874a35d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:14 np0005474864 NetworkManager[51631]: <info>  [1759868054.2183] manager: (tap7ce9ef63-68): new Tun device (/org/freedesktop/NetworkManager/Devices/62)
Oct  7 16:14:14 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f48a291815918ed8fb996cb95768bea1138ea5ebdadf13e6f19777f5f874a35d-userdata-shm.mount: Deactivated successfully.
Oct  7 16:14:14 np0005474864 systemd[1]: var-lib-containers-storage-overlay-e90366fa718a2598047dcea30085e935f589e25a16fbf914eb569e1889787251-merged.mount: Deactivated successfully.
Oct  7 16:14:14 np0005474864 podman[223418]: 2025-10-07 20:14:14.400155156 +0000 UTC m=+0.277667918 container cleanup f48a291815918ed8fb996cb95768bea1138ea5ebdadf13e6f19777f5f874a35d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  7 16:14:14 np0005474864 systemd[1]: libpod-conmon-f48a291815918ed8fb996cb95768bea1138ea5ebdadf13e6f19777f5f874a35d.scope: Deactivated successfully.
Oct  7 16:14:14 np0005474864 ovn_controller[94801]: 2025-10-07T20:14:14Z|00117|binding|INFO|Releasing lport 8debe99b-81af-459e-8217-4b06af1ff98e from this chassis (sb_readonly=0)
Oct  7 16:14:14 np0005474864 ovn_controller[94801]: 2025-10-07T20:14:14Z|00118|binding|INFO|Releasing lport 35331a4a-db8c-4977-9b95-771260e3e40b from this chassis (sb_readonly=0)
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.466 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.470 2 INFO nova.virt.libvirt.driver [-] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Instance destroyed successfully.#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.471 2 DEBUG nova.objects.instance [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'resources' on Instance uuid 1669315a-9455-4ddc-bddf-b5a535be9294 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:14:14 np0005474864 podman[223461]: 2025-10-07 20:14:14.475505408 +0000 UTC m=+0.048529334 container remove f48a291815918ed8fb996cb95768bea1138ea5ebdadf13e6f19777f5f874a35d (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.481 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[7fc947b4-fa8c-4efb-a10f-612605ab7d4a]: (4, ('Tue Oct  7 08:14:14 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea (f48a291815918ed8fb996cb95768bea1138ea5ebdadf13e6f19777f5f874a35d)\nf48a291815918ed8fb996cb95768bea1138ea5ebdadf13e6f19777f5f874a35d\nTue Oct  7 08:14:14 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea (f48a291815918ed8fb996cb95768bea1138ea5ebdadf13e6f19777f5f874a35d)\nf48a291815918ed8fb996cb95768bea1138ea5ebdadf13e6f19777f5f874a35d\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.482 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ad144ea0-a179-4cff-a8cb-cc16152e5b10]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.483 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap48bc8cb5-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:14 np0005474864 kernel: tap48bc8cb5-70: left promiscuous mode
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.505 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[e62e68ab-c228-4b1d-ab89-a001a9aebc26]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.510 2 DEBUG nova.virt.libvirt.vif [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:12:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-993173619',display_name='tempest-TestGettingAddress-server-993173619',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-993173619',id=16,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAMiPzYFMApCNDc9mIgm8Ln/jp3Xg1XJGHFUgqwN9wF6viQJ53hy2WYN1ZdeMyDZf3WAgFTiR2n+wfAYIJY6IB6Pdd0KHjGmouJJqkn9TClJJZ0hKIENBfl0N2NXC/VOKw==',key_name='tempest-TestGettingAddress-1769694523',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:12:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-mycfmhvq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:12:59Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=1669315a-9455-4ddc-bddf-b5a535be9294,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7ce9ef63-687e-420f-b85d-071abf475fd7", "address": "fa:16:3e:8f:62:17", "network": {"id": "48bc8cb5-7112-4ac0-bcc2-12066714d0ea", "bridge": "br-int", "label": "tempest-network-smoke--2080782381", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ce9ef63-68", "ovs_interfaceid": "7ce9ef63-687e-420f-b85d-071abf475fd7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.510 2 DEBUG nova.network.os_vif_util [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "7ce9ef63-687e-420f-b85d-071abf475fd7", "address": "fa:16:3e:8f:62:17", "network": {"id": "48bc8cb5-7112-4ac0-bcc2-12066714d0ea", "bridge": "br-int", "label": "tempest-network-smoke--2080782381", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.240", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ce9ef63-68", "ovs_interfaceid": "7ce9ef63-687e-420f-b85d-071abf475fd7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.512 2 DEBUG nova.network.os_vif_util [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8f:62:17,bridge_name='br-int',has_traffic_filtering=True,id=7ce9ef63-687e-420f-b85d-071abf475fd7,network=Network(48bc8cb5-7112-4ac0-bcc2-12066714d0ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ce9ef63-68') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.512 2 DEBUG os_vif [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8f:62:17,bridge_name='br-int',has_traffic_filtering=True,id=7ce9ef63-687e-420f-b85d-071abf475fd7,network=Network(48bc8cb5-7112-4ac0-bcc2-12066714d0ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ce9ef63-68') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.513 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.514 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7ce9ef63-68, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.516 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.530 2 INFO os_vif [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8f:62:17,bridge_name='br-int',has_traffic_filtering=True,id=7ce9ef63-687e-420f-b85d-071abf475fd7,network=Network(48bc8cb5-7112-4ac0-bcc2-12066714d0ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ce9ef63-68')#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.530 2 DEBUG nova.virt.libvirt.vif [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:12:39Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-993173619',display_name='tempest-TestGettingAddress-server-993173619',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-993173619',id=16,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAMiPzYFMApCNDc9mIgm8Ln/jp3Xg1XJGHFUgqwN9wF6viQJ53hy2WYN1ZdeMyDZf3WAgFTiR2n+wfAYIJY6IB6Pdd0KHjGmouJJqkn9TClJJZ0hKIENBfl0N2NXC/VOKw==',key_name='tempest-TestGettingAddress-1769694523',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:12:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-mycfmhvq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:12:59Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=1669315a-9455-4ddc-bddf-b5a535be9294,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "be7697b8-3851-4db2-8ae0-bc42997f1332", "address": "fa:16:3e:fc:38:3e", "network": {"id": "50d1db2f-7e6a-4b01-96dc-cd47acf22206", "bridge": "br-int", "label": "tempest-network-smoke--306467718", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefc:383e", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe7697b8-38", "ovs_interfaceid": "be7697b8-3851-4db2-8ae0-bc42997f1332", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.531 2 DEBUG nova.network.os_vif_util [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "be7697b8-3851-4db2-8ae0-bc42997f1332", "address": "fa:16:3e:fc:38:3e", "network": {"id": "50d1db2f-7e6a-4b01-96dc-cd47acf22206", "bridge": "br-int", "label": "tempest-network-smoke--306467718", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefc:383e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe7697b8-38", "ovs_interfaceid": "be7697b8-3851-4db2-8ae0-bc42997f1332", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.531 2 DEBUG nova.network.os_vif_util [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fc:38:3e,bridge_name='br-int',has_traffic_filtering=True,id=be7697b8-3851-4db2-8ae0-bc42997f1332,network=Network(50d1db2f-7e6a-4b01-96dc-cd47acf22206),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe7697b8-38') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.532 2 DEBUG os_vif [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:38:3e,bridge_name='br-int',has_traffic_filtering=True,id=be7697b8-3851-4db2-8ae0-bc42997f1332,network=Network(50d1db2f-7e6a-4b01-96dc-cd47acf22206),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe7697b8-38') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.533 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.533 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapbe7697b8-38, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.535 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.534 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[11687c02-a75d-49d7-9acd-97538ab8fcbc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.536 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[8fd47d97-704a-4756-882f-495f64f4ec60]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.538 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.540 2 INFO os_vif [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fc:38:3e,bridge_name='br-int',has_traffic_filtering=True,id=be7697b8-3851-4db2-8ae0-bc42997f1332,network=Network(50d1db2f-7e6a-4b01-96dc-cd47acf22206),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapbe7697b8-38')#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.541 2 INFO nova.virt.libvirt.driver [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Deleting instance files /var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294_del#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.541 2 INFO nova.virt.libvirt.driver [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Deletion of /var/lib/nova/instances/1669315a-9455-4ddc-bddf-b5a535be9294_del complete#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.554 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d2efb5f8-3766-42ac-a646-c767c45f2282]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 361078, 'reachable_time': 41485, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223491, 'error': None, 'target': 'ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:14 np0005474864 systemd[1]: run-netns-ovnmeta\x2d48bc8cb5\x2d7112\x2d4ac0\x2dbcc2\x2d12066714d0ea.mount: Deactivated successfully.
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.558 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-48bc8cb5-7112-4ac0-bcc2-12066714d0ea deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.558 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[0f2ef39d-c094-4950-8463-5b3a7b4015bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.559 103685 INFO neutron.agent.ovn.metadata.agent [-] Port be7697b8-3851-4db2-8ae0-bc42997f1332 in datapath 50d1db2f-7e6a-4b01-96dc-cd47acf22206 unbound from our chassis#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.561 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 50d1db2f-7e6a-4b01-96dc-cd47acf22206, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.561 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[e31abde7-4b79-4d20-ace0-d5aabeef8451]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.562 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206 namespace which is not needed anymore#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.603 2 INFO nova.compute.manager [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Took 0.64 seconds to destroy the instance on the hypervisor.#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.603 2 DEBUG oslo.service.loopingcall [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.604 2 DEBUG nova.compute.manager [-] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.604 2 DEBUG nova.network.neutron [-] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  7 16:14:14 np0005474864 neutron-haproxy-ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206[222574]: [NOTICE]   (222600) : haproxy version is 2.8.14-c23fe91
Oct  7 16:14:14 np0005474864 neutron-haproxy-ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206[222574]: [NOTICE]   (222600) : path to executable is /usr/sbin/haproxy
Oct  7 16:14:14 np0005474864 neutron-haproxy-ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206[222574]: [WARNING]  (222600) : Exiting Master process...
Oct  7 16:14:14 np0005474864 neutron-haproxy-ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206[222574]: [ALERT]    (222600) : Current worker (222616) exited with code 143 (Terminated)
Oct  7 16:14:14 np0005474864 neutron-haproxy-ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206[222574]: [WARNING]  (222600) : All workers exited. Exiting... (0)
Oct  7 16:14:14 np0005474864 systemd[1]: libpod-2b05ce08965264a963694492d1b2ecdecd73c1cdcb09823e966c921db459ea57.scope: Deactivated successfully.
Oct  7 16:14:14 np0005474864 podman[223509]: 2025-10-07 20:14:14.756553721 +0000 UTC m=+0.066003454 container died 2b05ce08965264a963694492d1b2ecdecd73c1cdcb09823e966c921db459ea57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  7 16:14:14 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2b05ce08965264a963694492d1b2ecdecd73c1cdcb09823e966c921db459ea57-userdata-shm.mount: Deactivated successfully.
Oct  7 16:14:14 np0005474864 systemd[1]: var-lib-containers-storage-overlay-3fa58bd42a183b1a0281e2e1f4efd13479c2bebcf9fd05a792c046c1422813f9-merged.mount: Deactivated successfully.
Oct  7 16:14:14 np0005474864 podman[223509]: 2025-10-07 20:14:14.80287343 +0000 UTC m=+0.112323193 container cleanup 2b05ce08965264a963694492d1b2ecdecd73c1cdcb09823e966c921db459ea57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:14:14 np0005474864 systemd[1]: libpod-conmon-2b05ce08965264a963694492d1b2ecdecd73c1cdcb09823e966c921db459ea57.scope: Deactivated successfully.
Oct  7 16:14:14 np0005474864 podman[223540]: 2025-10-07 20:14:14.89384661 +0000 UTC m=+0.062632398 container remove 2b05ce08965264a963694492d1b2ecdecd73c1cdcb09823e966c921db459ea57 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.899 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d15c7b48-8d19-4d91-9c84-2ab77758b572]: (4, ('Tue Oct  7 08:14:14 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206 (2b05ce08965264a963694492d1b2ecdecd73c1cdcb09823e966c921db459ea57)\n2b05ce08965264a963694492d1b2ecdecd73c1cdcb09823e966c921db459ea57\nTue Oct  7 08:14:14 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206 (2b05ce08965264a963694492d1b2ecdecd73c1cdcb09823e966c921db459ea57)\n2b05ce08965264a963694492d1b2ecdecd73c1cdcb09823e966c921db459ea57\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.902 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[974fce5a-a036-44a6-b7c3-0dda3f351737]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.904 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50d1db2f-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:14 np0005474864 kernel: tap50d1db2f-70: left promiscuous mode
Oct  7 16:14:14 np0005474864 nova_compute[192593]: 2025-10-07 20:14:14.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:14.985 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[491907c8-9cd2-4aaf-be09-82b46d857617]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:15 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:15.016 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[6800e44d-b5e3-420d-89da-67bc4bcf4831]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:15 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:15.020 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ee4f32e8-54ae-46d0-b846-62423d6f7ac1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:15 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:15.047 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[5590bd2c-1e31-40fe-8e48-9e52d844de25]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 361180, 'reachable_time': 30401, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223555, 'error': None, 'target': 'ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:15 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:15.049 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-50d1db2f-7e6a-4b01-96dc-cd47acf22206 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:14:15 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:15.050 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[c30dea62-535f-49d1-b558-67aa6a8a08b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:15 np0005474864 systemd[1]: run-netns-ovnmeta\x2d50d1db2f\x2d7e6a\x2d4b01\x2d96dc\x2dcd47acf22206.mount: Deactivated successfully.
Oct  7 16:14:15 np0005474864 nova_compute[192593]: 2025-10-07 20:14:15.902 2 DEBUG nova.compute.manager [req-5fcd4a6d-c257-47ba-af7a-5dc6ceffc44c req-6dfe4493-30bd-40be-9158-d135d64cb6cd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received event network-vif-unplugged-7ce9ef63-687e-420f-b85d-071abf475fd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:14:15 np0005474864 nova_compute[192593]: 2025-10-07 20:14:15.903 2 DEBUG oslo_concurrency.lockutils [req-5fcd4a6d-c257-47ba-af7a-5dc6ceffc44c req-6dfe4493-30bd-40be-9158-d135d64cb6cd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:14:15 np0005474864 nova_compute[192593]: 2025-10-07 20:14:15.903 2 DEBUG oslo_concurrency.lockutils [req-5fcd4a6d-c257-47ba-af7a-5dc6ceffc44c req-6dfe4493-30bd-40be-9158-d135d64cb6cd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:14:15 np0005474864 nova_compute[192593]: 2025-10-07 20:14:15.904 2 DEBUG oslo_concurrency.lockutils [req-5fcd4a6d-c257-47ba-af7a-5dc6ceffc44c req-6dfe4493-30bd-40be-9158-d135d64cb6cd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:14:15 np0005474864 nova_compute[192593]: 2025-10-07 20:14:15.904 2 DEBUG nova.compute.manager [req-5fcd4a6d-c257-47ba-af7a-5dc6ceffc44c req-6dfe4493-30bd-40be-9158-d135d64cb6cd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] No waiting events found dispatching network-vif-unplugged-7ce9ef63-687e-420f-b85d-071abf475fd7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:14:15 np0005474864 nova_compute[192593]: 2025-10-07 20:14:15.904 2 DEBUG nova.compute.manager [req-5fcd4a6d-c257-47ba-af7a-5dc6ceffc44c req-6dfe4493-30bd-40be-9158-d135d64cb6cd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received event network-vif-unplugged-7ce9ef63-687e-420f-b85d-071abf475fd7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  7 16:14:15 np0005474864 nova_compute[192593]: 2025-10-07 20:14:15.905 2 DEBUG nova.compute.manager [req-5fcd4a6d-c257-47ba-af7a-5dc6ceffc44c req-6dfe4493-30bd-40be-9158-d135d64cb6cd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received event network-vif-plugged-7ce9ef63-687e-420f-b85d-071abf475fd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:14:15 np0005474864 nova_compute[192593]: 2025-10-07 20:14:15.905 2 DEBUG oslo_concurrency.lockutils [req-5fcd4a6d-c257-47ba-af7a-5dc6ceffc44c req-6dfe4493-30bd-40be-9158-d135d64cb6cd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:14:15 np0005474864 nova_compute[192593]: 2025-10-07 20:14:15.906 2 DEBUG oslo_concurrency.lockutils [req-5fcd4a6d-c257-47ba-af7a-5dc6ceffc44c req-6dfe4493-30bd-40be-9158-d135d64cb6cd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:14:15 np0005474864 nova_compute[192593]: 2025-10-07 20:14:15.906 2 DEBUG oslo_concurrency.lockutils [req-5fcd4a6d-c257-47ba-af7a-5dc6ceffc44c req-6dfe4493-30bd-40be-9158-d135d64cb6cd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:14:15 np0005474864 nova_compute[192593]: 2025-10-07 20:14:15.906 2 DEBUG nova.compute.manager [req-5fcd4a6d-c257-47ba-af7a-5dc6ceffc44c req-6dfe4493-30bd-40be-9158-d135d64cb6cd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] No waiting events found dispatching network-vif-plugged-7ce9ef63-687e-420f-b85d-071abf475fd7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:14:15 np0005474864 nova_compute[192593]: 2025-10-07 20:14:15.906 2 WARNING nova.compute.manager [req-5fcd4a6d-c257-47ba-af7a-5dc6ceffc44c req-6dfe4493-30bd-40be-9158-d135d64cb6cd 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received unexpected event network-vif-plugged-7ce9ef63-687e-420f-b85d-071abf475fd7 for instance with vm_state active and task_state deleting.#033[00m
Oct  7 16:14:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:16.187 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:14:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:16.188 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:14:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:16.188 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:14:16 np0005474864 podman[223556]: 2025-10-07 20:14:16.410414151 +0000 UTC m=+0.093000439 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  7 16:14:17 np0005474864 nova_compute[192593]: 2025-10-07 20:14:17.217 2 DEBUG nova.network.neutron [req-021e155a-3cc4-44ca-bfb0-d9a5ef9879c4 req-a435ffe6-b894-41c2-a8e8-5cd1c1644f2b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Updated VIF entry in instance network info cache for port 7ce9ef63-687e-420f-b85d-071abf475fd7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:14:17 np0005474864 nova_compute[192593]: 2025-10-07 20:14:17.218 2 DEBUG nova.network.neutron [req-021e155a-3cc4-44ca-bfb0-d9a5ef9879c4 req-a435ffe6-b894-41c2-a8e8-5cd1c1644f2b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Updating instance_info_cache with network_info: [{"id": "7ce9ef63-687e-420f-b85d-071abf475fd7", "address": "fa:16:3e:8f:62:17", "network": {"id": "48bc8cb5-7112-4ac0-bcc2-12066714d0ea", "bridge": "br-int", "label": "tempest-network-smoke--2080782381", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ce9ef63-68", "ovs_interfaceid": "7ce9ef63-687e-420f-b85d-071abf475fd7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "be7697b8-3851-4db2-8ae0-bc42997f1332", "address": "fa:16:3e:fc:38:3e", "network": {"id": "50d1db2f-7e6a-4b01-96dc-cd47acf22206", "bridge": "br-int", "label": "tempest-network-smoke--306467718", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefc:383e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe7697b8-38", "ovs_interfaceid": "be7697b8-3851-4db2-8ae0-bc42997f1332", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:14:17 np0005474864 nova_compute[192593]: 2025-10-07 20:14:17.337 2 DEBUG oslo_concurrency.lockutils [req-021e155a-3cc4-44ca-bfb0-d9a5ef9879c4 req-a435ffe6-b894-41c2-a8e8-5cd1c1644f2b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-1669315a-9455-4ddc-bddf-b5a535be9294" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:14:18 np0005474864 nova_compute[192593]: 2025-10-07 20:14:18.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:18 np0005474864 nova_compute[192593]: 2025-10-07 20:14:18.063 2 DEBUG nova.compute.manager [req-637877e2-c2c7-421b-b32d-0002661bfd74 req-3ef78fa4-e9c3-4848-ae28-1560fea5cfae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received event network-vif-unplugged-be7697b8-3851-4db2-8ae0-bc42997f1332 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:14:18 np0005474864 nova_compute[192593]: 2025-10-07 20:14:18.064 2 DEBUG oslo_concurrency.lockutils [req-637877e2-c2c7-421b-b32d-0002661bfd74 req-3ef78fa4-e9c3-4848-ae28-1560fea5cfae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:14:18 np0005474864 nova_compute[192593]: 2025-10-07 20:14:18.065 2 DEBUG oslo_concurrency.lockutils [req-637877e2-c2c7-421b-b32d-0002661bfd74 req-3ef78fa4-e9c3-4848-ae28-1560fea5cfae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:14:18 np0005474864 nova_compute[192593]: 2025-10-07 20:14:18.065 2 DEBUG oslo_concurrency.lockutils [req-637877e2-c2c7-421b-b32d-0002661bfd74 req-3ef78fa4-e9c3-4848-ae28-1560fea5cfae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:14:18 np0005474864 nova_compute[192593]: 2025-10-07 20:14:18.066 2 DEBUG nova.compute.manager [req-637877e2-c2c7-421b-b32d-0002661bfd74 req-3ef78fa4-e9c3-4848-ae28-1560fea5cfae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] No waiting events found dispatching network-vif-unplugged-be7697b8-3851-4db2-8ae0-bc42997f1332 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:14:18 np0005474864 nova_compute[192593]: 2025-10-07 20:14:18.066 2 DEBUG nova.compute.manager [req-637877e2-c2c7-421b-b32d-0002661bfd74 req-3ef78fa4-e9c3-4848-ae28-1560fea5cfae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received event network-vif-unplugged-be7697b8-3851-4db2-8ae0-bc42997f1332 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  7 16:14:18 np0005474864 nova_compute[192593]: 2025-10-07 20:14:18.067 2 DEBUG nova.compute.manager [req-637877e2-c2c7-421b-b32d-0002661bfd74 req-3ef78fa4-e9c3-4848-ae28-1560fea5cfae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received event network-vif-deleted-7ce9ef63-687e-420f-b85d-071abf475fd7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:14:18 np0005474864 nova_compute[192593]: 2025-10-07 20:14:18.067 2 INFO nova.compute.manager [req-637877e2-c2c7-421b-b32d-0002661bfd74 req-3ef78fa4-e9c3-4848-ae28-1560fea5cfae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Neutron deleted interface 7ce9ef63-687e-420f-b85d-071abf475fd7; detaching it from the instance and deleting it from the info cache#033[00m
Oct  7 16:14:18 np0005474864 nova_compute[192593]: 2025-10-07 20:14:18.068 2 DEBUG nova.network.neutron [req-637877e2-c2c7-421b-b32d-0002661bfd74 req-3ef78fa4-e9c3-4848-ae28-1560fea5cfae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Updating instance_info_cache with network_info: [{"id": "be7697b8-3851-4db2-8ae0-bc42997f1332", "address": "fa:16:3e:fc:38:3e", "network": {"id": "50d1db2f-7e6a-4b01-96dc-cd47acf22206", "bridge": "br-int", "label": "tempest-network-smoke--306467718", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefc:383e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapbe7697b8-38", "ovs_interfaceid": "be7697b8-3851-4db2-8ae0-bc42997f1332", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:14:18 np0005474864 nova_compute[192593]: 2025-10-07 20:14:18.108 2 DEBUG nova.compute.manager [req-637877e2-c2c7-421b-b32d-0002661bfd74 req-3ef78fa4-e9c3-4848-ae28-1560fea5cfae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Detach interface failed, port_id=7ce9ef63-687e-420f-b85d-071abf475fd7, reason: Instance 1669315a-9455-4ddc-bddf-b5a535be9294 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  7 16:14:18 np0005474864 nova_compute[192593]: 2025-10-07 20:14:18.108 2 DEBUG nova.compute.manager [req-637877e2-c2c7-421b-b32d-0002661bfd74 req-3ef78fa4-e9c3-4848-ae28-1560fea5cfae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received event network-vif-plugged-be7697b8-3851-4db2-8ae0-bc42997f1332 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:14:18 np0005474864 nova_compute[192593]: 2025-10-07 20:14:18.109 2 DEBUG oslo_concurrency.lockutils [req-637877e2-c2c7-421b-b32d-0002661bfd74 req-3ef78fa4-e9c3-4848-ae28-1560fea5cfae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:14:18 np0005474864 nova_compute[192593]: 2025-10-07 20:14:18.109 2 DEBUG oslo_concurrency.lockutils [req-637877e2-c2c7-421b-b32d-0002661bfd74 req-3ef78fa4-e9c3-4848-ae28-1560fea5cfae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:14:18 np0005474864 nova_compute[192593]: 2025-10-07 20:14:18.110 2 DEBUG oslo_concurrency.lockutils [req-637877e2-c2c7-421b-b32d-0002661bfd74 req-3ef78fa4-e9c3-4848-ae28-1560fea5cfae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:14:18 np0005474864 nova_compute[192593]: 2025-10-07 20:14:18.110 2 DEBUG nova.compute.manager [req-637877e2-c2c7-421b-b32d-0002661bfd74 req-3ef78fa4-e9c3-4848-ae28-1560fea5cfae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] No waiting events found dispatching network-vif-plugged-be7697b8-3851-4db2-8ae0-bc42997f1332 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:14:18 np0005474864 nova_compute[192593]: 2025-10-07 20:14:18.111 2 WARNING nova.compute.manager [req-637877e2-c2c7-421b-b32d-0002661bfd74 req-3ef78fa4-e9c3-4848-ae28-1560fea5cfae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received unexpected event network-vif-plugged-be7697b8-3851-4db2-8ae0-bc42997f1332 for instance with vm_state active and task_state deleting.#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.021 2 DEBUG nova.network.neutron [-] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.055 2 INFO nova.compute.manager [-] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Took 4.45 seconds to deallocate network for instance.#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.114 2 DEBUG oslo_concurrency.lockutils [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.115 2 DEBUG oslo_concurrency.lockutils [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.118 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.180 2 DEBUG nova.compute.provider_tree [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.194 2 DEBUG nova.scheduler.client.report [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.218 2 DEBUG oslo_concurrency.lockutils [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.222 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.223 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.223 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.253 2 INFO nova.scheduler.client.report [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Deleted allocations for instance 1669315a-9455-4ddc-bddf-b5a535be9294#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.333 2 DEBUG oslo_concurrency.lockutils [None req-10ec6931-555e-4040-9ce2-66a4eded7a24 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "1669315a-9455-4ddc-bddf-b5a535be9294" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 5.376s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.471 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.473 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5775MB free_disk=73.46415710449219GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.474 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.474 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.535 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.535 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.539 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.644 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.662 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.686 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:14:19 np0005474864 nova_compute[192593]: 2025-10-07 20:14:19.686 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.212s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:14:20 np0005474864 nova_compute[192593]: 2025-10-07 20:14:20.226 2 DEBUG nova.compute.manager [req-9368e8de-97b2-4d35-ba9e-67f2e9139501 req-8ace738b-1153-4cc1-8f3e-3fde903c43d1 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Received event network-vif-deleted-be7697b8-3851-4db2-8ae0-bc42997f1332 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:14:20 np0005474864 nova_compute[192593]: 2025-10-07 20:14:20.687 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:14:22 np0005474864 nova_compute[192593]: 2025-10-07 20:14:22.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:14:23 np0005474864 nova_compute[192593]: 2025-10-07 20:14:23.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:23 np0005474864 nova_compute[192593]: 2025-10-07 20:14:23.088 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:14:23 np0005474864 nova_compute[192593]: 2025-10-07 20:14:23.091 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:14:23 np0005474864 nova_compute[192593]: 2025-10-07 20:14:23.091 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:14:23 np0005474864 nova_compute[192593]: 2025-10-07 20:14:23.134 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:14:23 np0005474864 nova_compute[192593]: 2025-10-07 20:14:23.135 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:14:23 np0005474864 nova_compute[192593]: 2025-10-07 20:14:23.743 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "286b3745-27da-4e69-9c3b-cfbe4a28e2e9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:14:23 np0005474864 nova_compute[192593]: 2025-10-07 20:14:23.744 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "286b3745-27da-4e69-9c3b-cfbe4a28e2e9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:14:23 np0005474864 nova_compute[192593]: 2025-10-07 20:14:23.767 2 DEBUG nova.compute.manager [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  7 16:14:23 np0005474864 nova_compute[192593]: 2025-10-07 20:14:23.873 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:14:23 np0005474864 nova_compute[192593]: 2025-10-07 20:14:23.874 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:14:23 np0005474864 nova_compute[192593]: 2025-10-07 20:14:23.883 2 DEBUG nova.virt.hardware [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  7 16:14:23 np0005474864 nova_compute[192593]: 2025-10-07 20:14:23.883 2 INFO nova.compute.claims [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  7 16:14:23 np0005474864 nova_compute[192593]: 2025-10-07 20:14:23.998 2 DEBUG nova.compute.provider_tree [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.016 2 DEBUG nova.scheduler.client.report [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.048 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.050 2 DEBUG nova.compute.manager [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.128 2 DEBUG nova.compute.manager [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.129 2 DEBUG nova.network.neutron [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.165 2 INFO nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.192 2 DEBUG nova.compute.manager [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.346 2 DEBUG nova.compute.manager [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.348 2 DEBUG nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.349 2 INFO nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Creating image(s)#033[00m
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.350 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "/var/lib/nova/instances/286b3745-27da-4e69-9c3b-cfbe4a28e2e9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.351 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "/var/lib/nova/instances/286b3745-27da-4e69-9c3b-cfbe4a28e2e9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.352 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "/var/lib/nova/instances/286b3745-27da-4e69-9c3b-cfbe4a28e2e9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.376 2 DEBUG oslo_concurrency.processutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.463 2 DEBUG oslo_concurrency.processutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.464 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.465 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.480 2 DEBUG oslo_concurrency.processutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.531 2 DEBUG oslo_concurrency.processutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.532 2 DEBUG oslo_concurrency.processutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/286b3745-27da-4e69-9c3b-cfbe4a28e2e9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.553 2 DEBUG nova.policy [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.567 2 DEBUG oslo_concurrency.processutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/286b3745-27da-4e69-9c3b-cfbe4a28e2e9/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.568 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.103s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.568 2 DEBUG oslo_concurrency.processutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.625 2 DEBUG oslo_concurrency.processutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.626 2 DEBUG nova.virt.disk.api [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Checking if we can resize image /var/lib/nova/instances/286b3745-27da-4e69-9c3b-cfbe4a28e2e9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.627 2 DEBUG oslo_concurrency.processutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/286b3745-27da-4e69-9c3b-cfbe4a28e2e9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.683 2 DEBUG oslo_concurrency.processutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/286b3745-27da-4e69-9c3b-cfbe4a28e2e9/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.684 2 DEBUG nova.virt.disk.api [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Cannot resize image /var/lib/nova/instances/286b3745-27da-4e69-9c3b-cfbe4a28e2e9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.685 2 DEBUG nova.objects.instance [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lazy-loading 'migration_context' on Instance uuid 286b3745-27da-4e69-9c3b-cfbe4a28e2e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.714 2 DEBUG nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.715 2 DEBUG nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Ensure instance console log exists: /var/lib/nova/instances/286b3745-27da-4e69-9c3b-cfbe4a28e2e9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.716 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.716 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:14:24 np0005474864 nova_compute[192593]: 2025-10-07 20:14:24.717 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:14:25 np0005474864 nova_compute[192593]: 2025-10-07 20:14:25.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:14:25 np0005474864 nova_compute[192593]: 2025-10-07 20:14:25.092 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  7 16:14:27 np0005474864 nova_compute[192593]: 2025-10-07 20:14:27.022 2 DEBUG nova.network.neutron [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Successfully created port: 82d68894-8f3b-4d74-8f89-efe26013d5be _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  7 16:14:27 np0005474864 nova_compute[192593]: 2025-10-07 20:14:27.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:14:28 np0005474864 nova_compute[192593]: 2025-10-07 20:14:28.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:14:29 np0005474864 podman[223593]: 2025-10-07 20:14:29.388925378 +0000 UTC m=+0.078042560 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  7 16:14:29 np0005474864 podman[223594]: 2025-10-07 20:14:29.420585257 +0000 UTC m=+0.104605272 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, distribution-scope=public, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, build-date=2025-08-20T13:12:41, name=ubi9-minimal, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc.)
Oct  7 16:14:29 np0005474864 nova_compute[192593]: 2025-10-07 20:14:29.469 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759868054.46765, 1669315a-9455-4ddc-bddf-b5a535be9294 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Oct  7 16:14:29 np0005474864 nova_compute[192593]: 2025-10-07 20:14:29.469 2 INFO nova.compute.manager [-] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] VM Stopped (Lifecycle Event)
Oct  7 16:14:29 np0005474864 nova_compute[192593]: 2025-10-07 20:14:29.505 2 DEBUG nova.compute.manager [None req-a5411385-4ddd-4021-b3c2-88127124f715 - - - - - -] [instance: 1669315a-9455-4ddc-bddf-b5a535be9294] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Oct  7 16:14:29 np0005474864 nova_compute[192593]: 2025-10-07 20:14:29.588 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:14:29 np0005474864 nova_compute[192593]: 2025-10-07 20:14:29.785 2 DEBUG nova.network.neutron [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Successfully updated port: 82d68894-8f3b-4d74-8f89-efe26013d5be _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  7 16:14:29 np0005474864 nova_compute[192593]: 2025-10-07 20:14:29.799 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "refresh_cache-286b3745-27da-4e69-9c3b-cfbe4a28e2e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  7 16:14:29 np0005474864 nova_compute[192593]: 2025-10-07 20:14:29.800 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquired lock "refresh_cache-286b3745-27da-4e69-9c3b-cfbe4a28e2e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  7 16:14:29 np0005474864 nova_compute[192593]: 2025-10-07 20:14:29.800 2 DEBUG nova.network.neutron [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Oct  7 16:14:30 np0005474864 nova_compute[192593]: 2025-10-07 20:14:30.421 2 DEBUG nova.network.neutron [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Oct  7 16:14:30 np0005474864 nova_compute[192593]: 2025-10-07 20:14:30.830 2 DEBUG nova.compute.manager [req-88f19f13-eaff-414a-a9a3-f2338882084c req-0487630f-6d97-40bf-ab17-4e06a5b00e77 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Received event network-changed-82d68894-8f3b-4d74-8f89-efe26013d5be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  7 16:14:30 np0005474864 nova_compute[192593]: 2025-10-07 20:14:30.830 2 DEBUG nova.compute.manager [req-88f19f13-eaff-414a-a9a3-f2338882084c req-0487630f-6d97-40bf-ab17-4e06a5b00e77 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Refreshing instance network info cache due to event network-changed-82d68894-8f3b-4d74-8f89-efe26013d5be. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  7 16:14:30 np0005474864 nova_compute[192593]: 2025-10-07 20:14:30.830 2 DEBUG oslo_concurrency.lockutils [req-88f19f13-eaff-414a-a9a3-f2338882084c req-0487630f-6d97-40bf-ab17-4e06a5b00e77 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-286b3745-27da-4e69-9c3b-cfbe4a28e2e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.253 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:14:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.034 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.273 2 DEBUG nova.network.neutron [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Updating instance_info_cache with network_info: [{"id": "82d68894-8f3b-4d74-8f89-efe26013d5be", "address": "fa:16:3e:38:a2:a6", "network": {"id": "efa8245e-71d2-4349-9429-c565da73214b", "bridge": "br-int", "label": "tempest-network-smoke--1789304152", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d68894-8f", "ovs_interfaceid": "82d68894-8f3b-4d74-8f89-efe26013d5be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.310 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Releasing lock "refresh_cache-286b3745-27da-4e69-9c3b-cfbe4a28e2e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.311 2 DEBUG nova.compute.manager [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Instance network_info: |[{"id": "82d68894-8f3b-4d74-8f89-efe26013d5be", "address": "fa:16:3e:38:a2:a6", "network": {"id": "efa8245e-71d2-4349-9429-c565da73214b", "bridge": "br-int", "label": "tempest-network-smoke--1789304152", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d68894-8f", "ovs_interfaceid": "82d68894-8f3b-4d74-8f89-efe26013d5be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.312 2 DEBUG oslo_concurrency.lockutils [req-88f19f13-eaff-414a-a9a3-f2338882084c req-0487630f-6d97-40bf-ab17-4e06a5b00e77 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-286b3745-27da-4e69-9c3b-cfbe4a28e2e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.312 2 DEBUG nova.network.neutron [req-88f19f13-eaff-414a-a9a3-f2338882084c req-0487630f-6d97-40bf-ab17-4e06a5b00e77 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Refreshing network info cache for port 82d68894-8f3b-4d74-8f89-efe26013d5be _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.317 2 DEBUG nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Start _get_guest_xml network_info=[{"id": "82d68894-8f3b-4d74-8f89-efe26013d5be", "address": "fa:16:3e:38:a2:a6", "network": {"id": "efa8245e-71d2-4349-9429-c565da73214b", "bridge": "br-int", "label": "tempest-network-smoke--1789304152", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d68894-8f", "ovs_interfaceid": "82d68894-8f3b-4d74-8f89-efe26013d5be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.326 2 WARNING nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.332 2 DEBUG nova.virt.libvirt.host [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.333 2 DEBUG nova.virt.libvirt.host [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.346 2 DEBUG nova.virt.libvirt.host [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.347 2 DEBUG nova.virt.libvirt.host [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.349 2 DEBUG nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.349 2 DEBUG nova.virt.hardware [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T20:09:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3fec056a-1226-48ad-a02c-e4fe097a9363',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.349 2 DEBUG nova.virt.hardware [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.350 2 DEBUG nova.virt.hardware [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.350 2 DEBUG nova.virt.hardware [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.350 2 DEBUG nova.virt.hardware [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.351 2 DEBUG nova.virt.hardware [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.351 2 DEBUG nova.virt.hardware [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.352 2 DEBUG nova.virt.hardware [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.352 2 DEBUG nova.virt.hardware [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.352 2 DEBUG nova.virt.hardware [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.353 2 DEBUG nova.virt.hardware [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.357 2 DEBUG nova.virt.libvirt.vif [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:14:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-916734331',display_name='tempest-TestNetworkAdvancedServerOps-server-916734331',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-916734331',id=21,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjK9C4E8bflG16gJB9wwBClk5jcuXQVv3We2+BCOLjSNX7yYBZtu1brRwsPtQfOxw6VpHDk7NqDL7enVTxwMx/7GPVdFOQaYGcupsXO8r3ahMk0w7jvpDNeupZh1J8veQ==',key_name='tempest-TestNetworkAdvancedServerOps-1365789110',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8a545a398e2e433bbe3f3dfa2ec4ebcb',ramdisk_id='',reservation_id='r-t4o04fv6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-585003851',owner_user_name='tempest-TestNetworkAdvancedServerOps-585003851-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:14:24Z,user_data=None,user_id='db22b0e0f6594362af24484ba9b01936',uuid=286b3745-27da-4e69-9c3b-cfbe4a28e2e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82d68894-8f3b-4d74-8f89-efe26013d5be", "address": "fa:16:3e:38:a2:a6", "network": {"id": "efa8245e-71d2-4349-9429-c565da73214b", "bridge": "br-int", "label": "tempest-network-smoke--1789304152", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d68894-8f", "ovs_interfaceid": "82d68894-8f3b-4d74-8f89-efe26013d5be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.357 2 DEBUG nova.network.os_vif_util [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converting VIF {"id": "82d68894-8f3b-4d74-8f89-efe26013d5be", "address": "fa:16:3e:38:a2:a6", "network": {"id": "efa8245e-71d2-4349-9429-c565da73214b", "bridge": "br-int", "label": "tempest-network-smoke--1789304152", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d68894-8f", "ovs_interfaceid": "82d68894-8f3b-4d74-8f89-efe26013d5be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.358 2 DEBUG nova.network.os_vif_util [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:a2:a6,bridge_name='br-int',has_traffic_filtering=True,id=82d68894-8f3b-4d74-8f89-efe26013d5be,network=Network(efa8245e-71d2-4349-9429-c565da73214b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82d68894-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.359 2 DEBUG nova.objects.instance [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lazy-loading 'pci_devices' on Instance uuid 286b3745-27da-4e69-9c3b-cfbe4a28e2e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.380 2 DEBUG nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] End _get_guest_xml xml=<domain type="kvm">
Oct  7 16:14:33 np0005474864 nova_compute[192593]:  <uuid>286b3745-27da-4e69-9c3b-cfbe4a28e2e9</uuid>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:  <name>instance-00000015</name>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:  <memory>131072</memory>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:  <vcpu>1</vcpu>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-916734331</nova:name>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <nova:creationTime>2025-10-07 20:14:33</nova:creationTime>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <nova:flavor name="m1.nano">
Oct  7 16:14:33 np0005474864 nova_compute[192593]:        <nova:memory>128</nova:memory>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:        <nova:disk>1</nova:disk>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:        <nova:swap>0</nova:swap>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:        <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:        <nova:vcpus>1</nova:vcpus>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      </nova:flavor>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <nova:owner>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:        <nova:user uuid="db22b0e0f6594362af24484ba9b01936">tempest-TestNetworkAdvancedServerOps-585003851-project-member</nova:user>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:        <nova:project uuid="8a545a398e2e433bbe3f3dfa2ec4ebcb">tempest-TestNetworkAdvancedServerOps-585003851</nova:project>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      </nova:owner>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <nova:ports>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:        <nova:port uuid="82d68894-8f3b-4d74-8f89-efe26013d5be">
Oct  7 16:14:33 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      </nova:ports>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    </nova:instance>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:  <sysinfo type="smbios">
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <entry name="manufacturer">RDO</entry>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <entry name="product">OpenStack Compute</entry>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <entry name="serial">286b3745-27da-4e69-9c3b-cfbe4a28e2e9</entry>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <entry name="uuid">286b3745-27da-4e69-9c3b-cfbe4a28e2e9</entry>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <entry name="family">Virtual Machine</entry>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <boot dev="hd"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <smbios mode="sysinfo"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <vmcoreinfo/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:  <clock offset="utc">
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <timer name="pit" tickpolicy="delay"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <timer name="hpet" present="no"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:  <cpu mode="custom" match="exact">
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <model>Nehalem</model>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <topology sockets="1" cores="1" threads="1"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <disk type="file" device="disk">
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/286b3745-27da-4e69-9c3b-cfbe4a28e2e9/disk"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <target dev="vda" bus="virtio"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <disk type="file" device="cdrom">
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <driver name="qemu" type="raw" cache="none"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/286b3745-27da-4e69-9c3b-cfbe4a28e2e9/disk.config"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <target dev="sda" bus="sata"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:38:a2:a6"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <target dev="tap82d68894-8f"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <serial type="pty">
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <log file="/var/lib/nova/instances/286b3745-27da-4e69-9c3b-cfbe4a28e2e9/console.log" append="off"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <input type="tablet" bus="usb"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <rng model="virtio">
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <backend model="random">/dev/urandom</backend>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <controller type="usb" index="0"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    <memballoon model="virtio">
Oct  7 16:14:33 np0005474864 nova_compute[192593]:      <stats period="10"/>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:14:33 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:14:33 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:14:33 np0005474864 nova_compute[192593]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.382 2 DEBUG nova.compute.manager [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Preparing to wait for external event network-vif-plugged-82d68894-8f3b-4d74-8f89-efe26013d5be prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.382 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "286b3745-27da-4e69-9c3b-cfbe4a28e2e9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.383 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "286b3745-27da-4e69-9c3b-cfbe4a28e2e9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.383 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "286b3745-27da-4e69-9c3b-cfbe4a28e2e9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.384 2 DEBUG nova.virt.libvirt.vif [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:14:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-916734331',display_name='tempest-TestNetworkAdvancedServerOps-server-916734331',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-916734331',id=21,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjK9C4E8bflG16gJB9wwBClk5jcuXQVv3We2+BCOLjSNX7yYBZtu1brRwsPtQfOxw6VpHDk7NqDL7enVTxwMx/7GPVdFOQaYGcupsXO8r3ahMk0w7jvpDNeupZh1J8veQ==',key_name='tempest-TestNetworkAdvancedServerOps-1365789110',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8a545a398e2e433bbe3f3dfa2ec4ebcb',ramdisk_id='',reservation_id='r-t4o04fv6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-585003851',owner_user_name='tempest-TestNetworkAdvancedServerOps-585003851-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:14:24Z,user_data=None,user_id='db22b0e0f6594362af24484ba9b01936',uuid=286b3745-27da-4e69-9c3b-cfbe4a28e2e9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82d68894-8f3b-4d74-8f89-efe26013d5be", "address": "fa:16:3e:38:a2:a6", "network": {"id": "efa8245e-71d2-4349-9429-c565da73214b", "bridge": "br-int", "label": "tempest-network-smoke--1789304152", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d68894-8f", "ovs_interfaceid": "82d68894-8f3b-4d74-8f89-efe26013d5be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.384 2 DEBUG nova.network.os_vif_util [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converting VIF {"id": "82d68894-8f3b-4d74-8f89-efe26013d5be", "address": "fa:16:3e:38:a2:a6", "network": {"id": "efa8245e-71d2-4349-9429-c565da73214b", "bridge": "br-int", "label": "tempest-network-smoke--1789304152", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d68894-8f", "ovs_interfaceid": "82d68894-8f3b-4d74-8f89-efe26013d5be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.385 2 DEBUG nova.network.os_vif_util [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:38:a2:a6,bridge_name='br-int',has_traffic_filtering=True,id=82d68894-8f3b-4d74-8f89-efe26013d5be,network=Network(efa8245e-71d2-4349-9429-c565da73214b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82d68894-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.385 2 DEBUG os_vif [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:a2:a6,bridge_name='br-int',has_traffic_filtering=True,id=82d68894-8f3b-4d74-8f89-efe26013d5be,network=Network(efa8245e-71d2-4349-9429-c565da73214b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82d68894-8f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.386 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.386 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.387 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.390 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.391 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82d68894-8f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.391 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap82d68894-8f, col_values=(('external_ids', {'iface-id': '82d68894-8f3b-4d74-8f89-efe26013d5be', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:38:a2:a6', 'vm-uuid': '286b3745-27da-4e69-9c3b-cfbe4a28e2e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.395 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:14:33 np0005474864 NetworkManager[51631]: <info>  [1759868073.3954] manager: (tap82d68894-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.405 2 INFO os_vif [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:38:a2:a6,bridge_name='br-int',has_traffic_filtering=True,id=82d68894-8f3b-4d74-8f89-efe26013d5be,network=Network(efa8245e-71d2-4349-9429-c565da73214b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82d68894-8f')#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.470 2 DEBUG nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.472 2 DEBUG nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.473 2 DEBUG nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] No VIF found with MAC fa:16:3e:38:a2:a6, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:14:33 np0005474864 nova_compute[192593]: 2025-10-07 20:14:33.474 2 INFO nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Using config drive#033[00m
Oct  7 16:14:34 np0005474864 nova_compute[192593]: 2025-10-07 20:14:34.242 2 INFO nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Creating config drive at /var/lib/nova/instances/286b3745-27da-4e69-9c3b-cfbe4a28e2e9/disk.config#033[00m
Oct  7 16:14:34 np0005474864 nova_compute[192593]: 2025-10-07 20:14:34.251 2 DEBUG oslo_concurrency.processutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/286b3745-27da-4e69-9c3b-cfbe4a28e2e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbynt8cbh execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:14:34 np0005474864 nova_compute[192593]: 2025-10-07 20:14:34.395 2 DEBUG oslo_concurrency.processutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/286b3745-27da-4e69-9c3b-cfbe4a28e2e9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpbynt8cbh" returned: 0 in 0.144s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:14:34 np0005474864 kernel: tap82d68894-8f: entered promiscuous mode
Oct  7 16:14:34 np0005474864 NetworkManager[51631]: <info>  [1759868074.5079] manager: (tap82d68894-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/64)
Oct  7 16:14:34 np0005474864 nova_compute[192593]: 2025-10-07 20:14:34.521 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:34 np0005474864 ovn_controller[94801]: 2025-10-07T20:14:34Z|00119|binding|INFO|Claiming lport 82d68894-8f3b-4d74-8f89-efe26013d5be for this chassis.
Oct  7 16:14:34 np0005474864 ovn_controller[94801]: 2025-10-07T20:14:34Z|00120|binding|INFO|82d68894-8f3b-4d74-8f89-efe26013d5be: Claiming fa:16:3e:38:a2:a6 10.100.0.13
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.537 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:a2:a6 10.100.0.13'], port_security=['fa:16:3e:38:a2:a6 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '286b3745-27da-4e69-9c3b-cfbe4a28e2e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-efa8245e-71d2-4349-9429-c565da73214b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cb6e2505-3ee0-4650-ae13-5b442a965a4b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7db772e6-5b77-4d0a-956c-e593c66e6c04, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=82d68894-8f3b-4d74-8f89-efe26013d5be) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.540 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 82d68894-8f3b-4d74-8f89-efe26013d5be in datapath efa8245e-71d2-4349-9429-c565da73214b bound to our chassis#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.542 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network efa8245e-71d2-4349-9429-c565da73214b#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.560 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ed14ef9b-3fe6-4bc6-b6a9-36dabb06d8c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.561 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapefa8245e-71 in ovnmeta-efa8245e-71d2-4349-9429-c565da73214b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.564 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapefa8245e-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.564 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[705c0545-ba47-45c7-8bf9-10783525da54]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.565 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c3b8c773-625a-45c0-bdf3-bebcbfb20fae]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.582 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[944518a3-5672-41ee-b03f-ea2311ce7935]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:34 np0005474864 systemd-machined[152586]: New machine qemu-7-instance-00000015.
Oct  7 16:14:34 np0005474864 nova_compute[192593]: 2025-10-07 20:14:34.595 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:34 np0005474864 ovn_controller[94801]: 2025-10-07T20:14:34Z|00121|binding|INFO|Setting lport 82d68894-8f3b-4d74-8f89-efe26013d5be ovn-installed in OVS
Oct  7 16:14:34 np0005474864 ovn_controller[94801]: 2025-10-07T20:14:34Z|00122|binding|INFO|Setting lport 82d68894-8f3b-4d74-8f89-efe26013d5be up in Southbound
Oct  7 16:14:34 np0005474864 nova_compute[192593]: 2025-10-07 20:14:34.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:34 np0005474864 systemd[1]: Started Virtual Machine qemu-7-instance-00000015.
Oct  7 16:14:34 np0005474864 systemd-udevd[223713]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.604 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[adbe4a13-afe7-4d80-b5c4-d556d2b3fad3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:34 np0005474864 podman[223650]: 2025-10-07 20:14:34.606732102 +0000 UTC m=+0.107817805 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  7 16:14:34 np0005474864 NetworkManager[51631]: <info>  [1759868074.6210] device (tap82d68894-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:14:34 np0005474864 NetworkManager[51631]: <info>  [1759868074.6218] device (tap82d68894-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:14:34 np0005474864 podman[223648]: 2025-10-07 20:14:34.622313209 +0000 UTC m=+0.142202371 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_managed=true, container_name=iscsid, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:14:34 np0005474864 podman[223649]: 2025-10-07 20:14:34.640337816 +0000 UTC m=+0.157939072 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true)
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.644 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[06978d31-2074-4eaa-a78a-de350b442d28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.650 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[2a249578-0113-44c5-b1d6-0d656b858267]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:34 np0005474864 NetworkManager[51631]: <info>  [1759868074.6517] manager: (tapefa8245e-70): new Veth device (/org/freedesktop/NetworkManager/Devices/65)
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.684 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[3aaca862-9010-45f1-8acb-e04eb21f7085]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.688 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[55fbe44a-fd41-482c-8332-2d7267e11eed]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:34 np0005474864 NetworkManager[51631]: <info>  [1759868074.7135] device (tapefa8245e-70): carrier: link connected
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.719 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[e13d37d5-058d-47f0-a032-58cff875687c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.739 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[8e65d2b3-bf64-4c5f-b0a6-af065bd3187b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapefa8245e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5e:39:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371060, 'reachable_time': 23932, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223750, 'error': None, 'target': 'ovnmeta-efa8245e-71d2-4349-9429-c565da73214b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.758 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[387e27a7-0e50-483e-b0fb-79a3aff56572]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5e:3908'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371060, 'tstamp': 371060}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223751, 'error': None, 'target': 'ovnmeta-efa8245e-71d2-4349-9429-c565da73214b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.780 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[703e5153-2c04-4009-baa7-b6c7483707c5]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapefa8245e-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5e:39:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 37], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371060, 'reachable_time': 23932, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223752, 'error': None, 'target': 'ovnmeta-efa8245e-71d2-4349-9429-c565da73214b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.822 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ca951430-6874-49e6-a5b3-0d80e1b3fa93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.906 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[e15220f9-b128-49a5-8f0b-fe78b8c5093d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.908 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapefa8245e-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.908 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.909 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapefa8245e-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:14:34 np0005474864 nova_compute[192593]: 2025-10-07 20:14:34.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:34 np0005474864 NetworkManager[51631]: <info>  [1759868074.9130] manager: (tapefa8245e-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Oct  7 16:14:34 np0005474864 kernel: tapefa8245e-70: entered promiscuous mode
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.921 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapefa8245e-70, col_values=(('external_ids', {'iface-id': '21a069dc-d366-4cbd-972c-189634d3700f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:14:34 np0005474864 nova_compute[192593]: 2025-10-07 20:14:34.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:34 np0005474864 ovn_controller[94801]: 2025-10-07T20:14:34Z|00123|binding|INFO|Releasing lport 21a069dc-d366-4cbd-972c-189634d3700f from this chassis (sb_readonly=0)
Oct  7 16:14:34 np0005474864 nova_compute[192593]: 2025-10-07 20:14:34.925 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.927 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/efa8245e-71d2-4349-9429-c565da73214b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/efa8245e-71d2-4349-9429-c565da73214b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.928 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[aaabf073-e24a-4712-b2f1-3e6ce0ee5ebc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.929 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-efa8245e-71d2-4349-9429-c565da73214b
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/efa8245e-71d2-4349-9429-c565da73214b.pid.haproxy
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID efa8245e-71d2-4349-9429-c565da73214b
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:14:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:34.930 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-efa8245e-71d2-4349-9429-c565da73214b', 'env', 'PROCESS_TAG=haproxy-efa8245e-71d2-4349-9429-c565da73214b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/efa8245e-71d2-4349-9429-c565da73214b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:14:34 np0005474864 nova_compute[192593]: 2025-10-07 20:14:34.948 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.149 2 DEBUG nova.compute.manager [req-5b261c70-3926-469b-838b-25e52aba4c45 req-6ce18d76-8787-47f3-942f-11d5cefcab34 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Received event network-vif-plugged-82d68894-8f3b-4d74-8f89-efe26013d5be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.150 2 DEBUG oslo_concurrency.lockutils [req-5b261c70-3926-469b-838b-25e52aba4c45 req-6ce18d76-8787-47f3-942f-11d5cefcab34 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "286b3745-27da-4e69-9c3b-cfbe4a28e2e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.150 2 DEBUG oslo_concurrency.lockutils [req-5b261c70-3926-469b-838b-25e52aba4c45 req-6ce18d76-8787-47f3-942f-11d5cefcab34 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "286b3745-27da-4e69-9c3b-cfbe4a28e2e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.150 2 DEBUG oslo_concurrency.lockutils [req-5b261c70-3926-469b-838b-25e52aba4c45 req-6ce18d76-8787-47f3-942f-11d5cefcab34 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "286b3745-27da-4e69-9c3b-cfbe4a28e2e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.150 2 DEBUG nova.compute.manager [req-5b261c70-3926-469b-838b-25e52aba4c45 req-6ce18d76-8787-47f3-942f-11d5cefcab34 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Processing event network-vif-plugged-82d68894-8f3b-4d74-8f89-efe26013d5be _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:14:35 np0005474864 podman[223791]: 2025-10-07 20:14:35.370359711 +0000 UTC m=+0.077876315 container create deee14105802b641b2458bc9200f8c97ab060a7d530ccf413749a4ce08aa6795 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-efa8245e-71d2-4349-9429-c565da73214b, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  7 16:14:35 np0005474864 systemd[1]: Started libpod-conmon-deee14105802b641b2458bc9200f8c97ab060a7d530ccf413749a4ce08aa6795.scope.
Oct  7 16:14:35 np0005474864 podman[223791]: 2025-10-07 20:14:35.329627883 +0000 UTC m=+0.037144537 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.426 2 DEBUG nova.compute.manager [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  7 16:14:35 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.428 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868075.426053, 286b3745-27da-4e69-9c3b-cfbe4a28e2e9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.428 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] VM Started (Lifecycle Event)#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.432 2 DEBUG nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  7 16:14:35 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0350a589988e71b6832eda819e040d32ef7b2fe4b9665c40b9fef894dfef2b7b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.436 2 INFO nova.virt.libvirt.driver [-] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Instance spawned successfully.#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.437 2 DEBUG nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  7 16:14:35 np0005474864 podman[223791]: 2025-10-07 20:14:35.451934702 +0000 UTC m=+0.159451286 container init deee14105802b641b2458bc9200f8c97ab060a7d530ccf413749a4ce08aa6795 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-efa8245e-71d2-4349-9429-c565da73214b, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.452 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.463 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:14:35 np0005474864 podman[223791]: 2025-10-07 20:14:35.468230249 +0000 UTC m=+0.175746823 container start deee14105802b641b2458bc9200f8c97ab060a7d530ccf413749a4ce08aa6795 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-efa8245e-71d2-4349-9429-c565da73214b, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.469 2 DEBUG nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.469 2 DEBUG nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.470 2 DEBUG nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.471 2 DEBUG nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.472 2 DEBUG nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.473 2 DEBUG nova.virt.libvirt.driver [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.482 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.483 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868075.4314337, 286b3745-27da-4e69-9c3b-cfbe4a28e2e9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.483 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:14:35 np0005474864 neutron-haproxy-ovnmeta-efa8245e-71d2-4349-9429-c565da73214b[223806]: [NOTICE]   (223810) : New worker (223812) forked
Oct  7 16:14:35 np0005474864 neutron-haproxy-ovnmeta-efa8245e-71d2-4349-9429-c565da73214b[223806]: [NOTICE]   (223810) : Loading success.
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.515 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.519 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868075.4318554, 286b3745-27da-4e69-9c3b-cfbe4a28e2e9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.520 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.538 2 INFO nova.compute.manager [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Took 11.19 seconds to spawn the instance on the hypervisor.#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.538 2 DEBUG nova.compute.manager [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.549 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.553 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.581 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.607 2 INFO nova.compute.manager [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Took 11.76 seconds to build instance.#033[00m
Oct  7 16:14:35 np0005474864 nova_compute[192593]: 2025-10-07 20:14:35.627 2 DEBUG oslo_concurrency.lockutils [None req-55a864a5-9843-44fd-9a61-4f403a8e83b1 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "286b3745-27da-4e69-9c3b-cfbe4a28e2e9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.883s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:14:36 np0005474864 nova_compute[192593]: 2025-10-07 20:14:36.351 2 DEBUG nova.network.neutron [req-88f19f13-eaff-414a-a9a3-f2338882084c req-0487630f-6d97-40bf-ab17-4e06a5b00e77 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Updated VIF entry in instance network info cache for port 82d68894-8f3b-4d74-8f89-efe26013d5be. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:14:36 np0005474864 nova_compute[192593]: 2025-10-07 20:14:36.352 2 DEBUG nova.network.neutron [req-88f19f13-eaff-414a-a9a3-f2338882084c req-0487630f-6d97-40bf-ab17-4e06a5b00e77 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Updating instance_info_cache with network_info: [{"id": "82d68894-8f3b-4d74-8f89-efe26013d5be", "address": "fa:16:3e:38:a2:a6", "network": {"id": "efa8245e-71d2-4349-9429-c565da73214b", "bridge": "br-int", "label": "tempest-network-smoke--1789304152", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d68894-8f", "ovs_interfaceid": "82d68894-8f3b-4d74-8f89-efe26013d5be", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:14:36 np0005474864 nova_compute[192593]: 2025-10-07 20:14:36.377 2 DEBUG oslo_concurrency.lockutils [req-88f19f13-eaff-414a-a9a3-f2338882084c req-0487630f-6d97-40bf-ab17-4e06a5b00e77 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-286b3745-27da-4e69-9c3b-cfbe4a28e2e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:14:37 np0005474864 nova_compute[192593]: 2025-10-07 20:14:37.250 2 DEBUG nova.compute.manager [req-68b69914-6bcf-4f8b-aca6-f427228c9196 req-abc30aea-16b3-42df-9f98-9a7bcc980fe7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Received event network-vif-plugged-82d68894-8f3b-4d74-8f89-efe26013d5be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:14:37 np0005474864 nova_compute[192593]: 2025-10-07 20:14:37.251 2 DEBUG oslo_concurrency.lockutils [req-68b69914-6bcf-4f8b-aca6-f427228c9196 req-abc30aea-16b3-42df-9f98-9a7bcc980fe7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "286b3745-27da-4e69-9c3b-cfbe4a28e2e9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:14:37 np0005474864 nova_compute[192593]: 2025-10-07 20:14:37.251 2 DEBUG oslo_concurrency.lockutils [req-68b69914-6bcf-4f8b-aca6-f427228c9196 req-abc30aea-16b3-42df-9f98-9a7bcc980fe7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "286b3745-27da-4e69-9c3b-cfbe4a28e2e9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:14:37 np0005474864 nova_compute[192593]: 2025-10-07 20:14:37.252 2 DEBUG oslo_concurrency.lockutils [req-68b69914-6bcf-4f8b-aca6-f427228c9196 req-abc30aea-16b3-42df-9f98-9a7bcc980fe7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "286b3745-27da-4e69-9c3b-cfbe4a28e2e9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:14:37 np0005474864 nova_compute[192593]: 2025-10-07 20:14:37.252 2 DEBUG nova.compute.manager [req-68b69914-6bcf-4f8b-aca6-f427228c9196 req-abc30aea-16b3-42df-9f98-9a7bcc980fe7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] No waiting events found dispatching network-vif-plugged-82d68894-8f3b-4d74-8f89-efe26013d5be pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:14:37 np0005474864 nova_compute[192593]: 2025-10-07 20:14:37.253 2 WARNING nova.compute.manager [req-68b69914-6bcf-4f8b-aca6-f427228c9196 req-abc30aea-16b3-42df-9f98-9a7bcc980fe7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Received unexpected event network-vif-plugged-82d68894-8f3b-4d74-8f89-efe26013d5be for instance with vm_state active and task_state None.#033[00m
Oct  7 16:14:38 np0005474864 nova_compute[192593]: 2025-10-07 20:14:38.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:38 np0005474864 nova_compute[192593]: 2025-10-07 20:14:38.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:40 np0005474864 podman[223821]: 2025-10-07 20:14:40.380088196 +0000 UTC m=+0.063960496 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2)
Oct  7 16:14:41 np0005474864 NetworkManager[51631]: <info>  [1759868081.3325] manager: (patch-br-int-to-provnet-53a6f422-cce6-4082-98aa-3619989346bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Oct  7 16:14:41 np0005474864 NetworkManager[51631]: <info>  [1759868081.3335] manager: (patch-provnet-53a6f422-cce6-4082-98aa-3619989346bc-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/68)
Oct  7 16:14:41 np0005474864 nova_compute[192593]: 2025-10-07 20:14:41.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:41 np0005474864 nova_compute[192593]: 2025-10-07 20:14:41.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:41 np0005474864 ovn_controller[94801]: 2025-10-07T20:14:41Z|00124|binding|INFO|Releasing lport 21a069dc-d366-4cbd-972c-189634d3700f from this chassis (sb_readonly=0)
Oct  7 16:14:41 np0005474864 nova_compute[192593]: 2025-10-07 20:14:41.526 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:41 np0005474864 nova_compute[192593]: 2025-10-07 20:14:41.633 2 DEBUG nova.compute.manager [req-074b4da2-cad6-4578-982d-34810a9eabe6 req-c54ba44d-9912-45db-ac65-6a08bb732d52 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Received event network-changed-82d68894-8f3b-4d74-8f89-efe26013d5be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:14:41 np0005474864 nova_compute[192593]: 2025-10-07 20:14:41.634 2 DEBUG nova.compute.manager [req-074b4da2-cad6-4578-982d-34810a9eabe6 req-c54ba44d-9912-45db-ac65-6a08bb732d52 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Refreshing instance network info cache due to event network-changed-82d68894-8f3b-4d74-8f89-efe26013d5be. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:14:41 np0005474864 nova_compute[192593]: 2025-10-07 20:14:41.634 2 DEBUG oslo_concurrency.lockutils [req-074b4da2-cad6-4578-982d-34810a9eabe6 req-c54ba44d-9912-45db-ac65-6a08bb732d52 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-286b3745-27da-4e69-9c3b-cfbe4a28e2e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:14:41 np0005474864 nova_compute[192593]: 2025-10-07 20:14:41.635 2 DEBUG oslo_concurrency.lockutils [req-074b4da2-cad6-4578-982d-34810a9eabe6 req-c54ba44d-9912-45db-ac65-6a08bb732d52 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-286b3745-27da-4e69-9c3b-cfbe4a28e2e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:14:41 np0005474864 nova_compute[192593]: 2025-10-07 20:14:41.635 2 DEBUG nova.network.neutron [req-074b4da2-cad6-4578-982d-34810a9eabe6 req-c54ba44d-9912-45db-ac65-6a08bb732d52 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Refreshing network info cache for port 82d68894-8f3b-4d74-8f89-efe26013d5be _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:14:43 np0005474864 nova_compute[192593]: 2025-10-07 20:14:43.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:43 np0005474864 nova_compute[192593]: 2025-10-07 20:14:43.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:43 np0005474864 podman[223840]: 2025-10-07 20:14:43.41464333 +0000 UTC m=+0.099839435 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:14:44 np0005474864 nova_compute[192593]: 2025-10-07 20:14:44.551 2 DEBUG nova.network.neutron [req-074b4da2-cad6-4578-982d-34810a9eabe6 req-c54ba44d-9912-45db-ac65-6a08bb732d52 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Updated VIF entry in instance network info cache for port 82d68894-8f3b-4d74-8f89-efe26013d5be. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:14:44 np0005474864 nova_compute[192593]: 2025-10-07 20:14:44.553 2 DEBUG nova.network.neutron [req-074b4da2-cad6-4578-982d-34810a9eabe6 req-c54ba44d-9912-45db-ac65-6a08bb732d52 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Updating instance_info_cache with network_info: [{"id": "82d68894-8f3b-4d74-8f89-efe26013d5be", "address": "fa:16:3e:38:a2:a6", "network": {"id": "efa8245e-71d2-4349-9429-c565da73214b", "bridge": "br-int", "label": "tempest-network-smoke--1789304152", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d68894-8f", "ovs_interfaceid": "82d68894-8f3b-4d74-8f89-efe26013d5be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:14:44 np0005474864 nova_compute[192593]: 2025-10-07 20:14:44.592 2 DEBUG oslo_concurrency.lockutils [req-074b4da2-cad6-4578-982d-34810a9eabe6 req-c54ba44d-9912-45db-ac65-6a08bb732d52 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-286b3745-27da-4e69-9c3b-cfbe4a28e2e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:14:46 np0005474864 podman[223884]: 2025-10-07 20:14:46.737125345 +0000 UTC m=+0.098272600 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  7 16:14:47 np0005474864 ovn_controller[94801]: 2025-10-07T20:14:47Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:38:a2:a6 10.100.0.13
Oct  7 16:14:47 np0005474864 ovn_controller[94801]: 2025-10-07T20:14:47Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:38:a2:a6 10.100.0.13
Oct  7 16:14:48 np0005474864 nova_compute[192593]: 2025-10-07 20:14:48.043 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:48 np0005474864 nova_compute[192593]: 2025-10-07 20:14:48.272 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:48 np0005474864 nova_compute[192593]: 2025-10-07 20:14:48.399 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:49 np0005474864 nova_compute[192593]: 2025-10-07 20:14:49.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:49 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:49.941 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:76:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'ba:bb:91:e8:2b:5d'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:14:49 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:49.943 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  7 16:14:49 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:14:49.945 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:14:53 np0005474864 nova_compute[192593]: 2025-10-07 20:14:53.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:53 np0005474864 nova_compute[192593]: 2025-10-07 20:14:53.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:53 np0005474864 nova_compute[192593]: 2025-10-07 20:14:53.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:54 np0005474864 nova_compute[192593]: 2025-10-07 20:14:54.117 2 INFO nova.compute.manager [None req-cd77840b-9af7-40af-b4d8-89adbf4765ef db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Get console output#033[00m
Oct  7 16:14:54 np0005474864 nova_compute[192593]: 2025-10-07 20:14:54.124 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  7 16:14:54 np0005474864 nova_compute[192593]: 2025-10-07 20:14:54.957 2 INFO nova.compute.manager [None req-309ed55e-9482-4d80-9a07-95b7a607d9eb db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Pausing#033[00m
Oct  7 16:14:54 np0005474864 nova_compute[192593]: 2025-10-07 20:14:54.959 2 DEBUG nova.objects.instance [None req-309ed55e-9482-4d80-9a07-95b7a607d9eb db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lazy-loading 'flavor' on Instance uuid 286b3745-27da-4e69-9c3b-cfbe4a28e2e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:14:54 np0005474864 nova_compute[192593]: 2025-10-07 20:14:54.995 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868094.9953017, 286b3745-27da-4e69-9c3b-cfbe4a28e2e9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:14:54 np0005474864 nova_compute[192593]: 2025-10-07 20:14:54.996 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:14:55 np0005474864 nova_compute[192593]: 2025-10-07 20:14:55.000 2 DEBUG nova.compute.manager [None req-309ed55e-9482-4d80-9a07-95b7a607d9eb db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:14:55 np0005474864 nova_compute[192593]: 2025-10-07 20:14:55.016 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:14:55 np0005474864 nova_compute[192593]: 2025-10-07 20:14:55.022 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:14:55 np0005474864 nova_compute[192593]: 2025-10-07 20:14:55.063 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Oct  7 16:14:58 np0005474864 nova_compute[192593]: 2025-10-07 20:14:58.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:58 np0005474864 nova_compute[192593]: 2025-10-07 20:14:58.248 2 INFO nova.compute.manager [None req-a75616c8-0cd8-4996-bd02-a0b714aac6c9 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Get console output#033[00m
Oct  7 16:14:58 np0005474864 nova_compute[192593]: 2025-10-07 20:14:58.254 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  7 16:14:58 np0005474864 nova_compute[192593]: 2025-10-07 20:14:58.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:14:58 np0005474864 nova_compute[192593]: 2025-10-07 20:14:58.423 2 INFO nova.compute.manager [None req-e82bbb25-8ea3-4242-906c-06cf4e44040f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Unpausing#033[00m
Oct  7 16:14:58 np0005474864 nova_compute[192593]: 2025-10-07 20:14:58.426 2 DEBUG nova.objects.instance [None req-e82bbb25-8ea3-4242-906c-06cf4e44040f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lazy-loading 'flavor' on Instance uuid 286b3745-27da-4e69-9c3b-cfbe4a28e2e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:14:58 np0005474864 nova_compute[192593]: 2025-10-07 20:14:58.455 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868098.45501, 286b3745-27da-4e69-9c3b-cfbe4a28e2e9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:14:58 np0005474864 nova_compute[192593]: 2025-10-07 20:14:58.456 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:14:58 np0005474864 virtqemud[192092]: argument unsupported: QEMU guest agent is not configured
Oct  7 16:14:58 np0005474864 nova_compute[192593]: 2025-10-07 20:14:58.463 2 DEBUG nova.virt.libvirt.guest [None req-e82bbb25-8ea3-4242-906c-06cf4e44040f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Oct  7 16:14:58 np0005474864 nova_compute[192593]: 2025-10-07 20:14:58.463 2 DEBUG nova.compute.manager [None req-e82bbb25-8ea3-4242-906c-06cf4e44040f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:14:58 np0005474864 nova_compute[192593]: 2025-10-07 20:14:58.479 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:14:58 np0005474864 nova_compute[192593]: 2025-10-07 20:14:58.485 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:14:58 np0005474864 nova_compute[192593]: 2025-10-07 20:14:58.518 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] During sync_power_state the instance has a pending task (unpausing). Skip.#033[00m
Oct  7 16:15:00 np0005474864 podman[223906]: 2025-10-07 20:15:00.381648409 +0000 UTC m=+0.067698913 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:15:00 np0005474864 podman[223907]: 2025-10-07 20:15:00.383940255 +0000 UTC m=+0.073384636 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, architecture=x86_64, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  7 16:15:01 np0005474864 nova_compute[192593]: 2025-10-07 20:15:01.514 2 INFO nova.compute.manager [None req-780ae357-b132-4beb-af2c-dd78edcc7ddd db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Get console output#033[00m
Oct  7 16:15:01 np0005474864 nova_compute[192593]: 2025-10-07 20:15:01.524 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  7 16:15:01 np0005474864 nova_compute[192593]: 2025-10-07 20:15:01.918 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:02 np0005474864 nova_compute[192593]: 2025-10-07 20:15:02.975 2 DEBUG oslo_concurrency.lockutils [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "286b3745-27da-4e69-9c3b-cfbe4a28e2e9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:15:02 np0005474864 nova_compute[192593]: 2025-10-07 20:15:02.976 2 DEBUG oslo_concurrency.lockutils [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "286b3745-27da-4e69-9c3b-cfbe4a28e2e9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:15:02 np0005474864 nova_compute[192593]: 2025-10-07 20:15:02.977 2 DEBUG oslo_concurrency.lockutils [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "286b3745-27da-4e69-9c3b-cfbe4a28e2e9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:15:02 np0005474864 nova_compute[192593]: 2025-10-07 20:15:02.977 2 DEBUG oslo_concurrency.lockutils [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "286b3745-27da-4e69-9c3b-cfbe4a28e2e9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:15:02 np0005474864 nova_compute[192593]: 2025-10-07 20:15:02.978 2 DEBUG oslo_concurrency.lockutils [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "286b3745-27da-4e69-9c3b-cfbe4a28e2e9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:15:02 np0005474864 nova_compute[192593]: 2025-10-07 20:15:02.980 2 INFO nova.compute.manager [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Terminating instance#033[00m
Oct  7 16:15:02 np0005474864 nova_compute[192593]: 2025-10-07 20:15:02.982 2 DEBUG nova.compute.manager [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  7 16:15:03 np0005474864 kernel: tap82d68894-8f (unregistering): left promiscuous mode
Oct  7 16:15:03 np0005474864 NetworkManager[51631]: <info>  [1759868103.0246] device (tap82d68894-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:15:03 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:03Z|00125|binding|INFO|Releasing lport 82d68894-8f3b-4d74-8f89-efe26013d5be from this chassis (sb_readonly=0)
Oct  7 16:15:03 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:03Z|00126|binding|INFO|Setting lport 82d68894-8f3b-4d74-8f89-efe26013d5be down in Southbound
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.039 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:03 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:03Z|00127|binding|INFO|Removing iface tap82d68894-8f ovn-installed in OVS
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.045 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:03.052 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:38:a2:a6 10.100.0.13'], port_security=['fa:16:3e:38:a2:a6 10.100.0.13'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28', 'neutron:device_id': '286b3745-27da-4e69-9c3b-cfbe4a28e2e9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-efa8245e-71d2-4349-9429-c565da73214b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cb6e2505-3ee0-4650-ae13-5b442a965a4b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7db772e6-5b77-4d0a-956c-e593c66e6c04, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=82d68894-8f3b-4d74-8f89-efe26013d5be) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:15:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:03.055 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 82d68894-8f3b-4d74-8f89-efe26013d5be in datapath efa8245e-71d2-4349-9429-c565da73214b unbound from our chassis#033[00m
Oct  7 16:15:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:03.059 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network efa8245e-71d2-4349-9429-c565da73214b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:15:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:03.061 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c2e41f1d-8436-4931-95d9-32d310859394]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:03.063 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-efa8245e-71d2-4349-9429-c565da73214b namespace which is not needed anymore#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.085 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:03 np0005474864 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000015.scope: Deactivated successfully.
Oct  7 16:15:03 np0005474864 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d00000015.scope: Consumed 13.225s CPU time.
Oct  7 16:15:03 np0005474864 systemd-machined[152586]: Machine qemu-7-instance-00000015 terminated.
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.212 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.264 2 INFO nova.virt.libvirt.driver [-] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Instance destroyed successfully.#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.265 2 DEBUG nova.objects.instance [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lazy-loading 'resources' on Instance uuid 286b3745-27da-4e69-9c3b-cfbe4a28e2e9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:15:03 np0005474864 neutron-haproxy-ovnmeta-efa8245e-71d2-4349-9429-c565da73214b[223806]: [NOTICE]   (223810) : haproxy version is 2.8.14-c23fe91
Oct  7 16:15:03 np0005474864 neutron-haproxy-ovnmeta-efa8245e-71d2-4349-9429-c565da73214b[223806]: [NOTICE]   (223810) : path to executable is /usr/sbin/haproxy
Oct  7 16:15:03 np0005474864 neutron-haproxy-ovnmeta-efa8245e-71d2-4349-9429-c565da73214b[223806]: [WARNING]  (223810) : Exiting Master process...
Oct  7 16:15:03 np0005474864 neutron-haproxy-ovnmeta-efa8245e-71d2-4349-9429-c565da73214b[223806]: [WARNING]  (223810) : Exiting Master process...
Oct  7 16:15:03 np0005474864 neutron-haproxy-ovnmeta-efa8245e-71d2-4349-9429-c565da73214b[223806]: [ALERT]    (223810) : Current worker (223812) exited with code 143 (Terminated)
Oct  7 16:15:03 np0005474864 neutron-haproxy-ovnmeta-efa8245e-71d2-4349-9429-c565da73214b[223806]: [WARNING]  (223810) : All workers exited. Exiting... (0)
Oct  7 16:15:03 np0005474864 systemd[1]: libpod-deee14105802b641b2458bc9200f8c97ab060a7d530ccf413749a4ce08aa6795.scope: Deactivated successfully.
Oct  7 16:15:03 np0005474864 podman[223976]: 2025-10-07 20:15:03.28890404 +0000 UTC m=+0.070794982 container died deee14105802b641b2458bc9200f8c97ab060a7d530ccf413749a4ce08aa6795 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-efa8245e-71d2-4349-9429-c565da73214b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.290 2 DEBUG nova.virt.libvirt.vif [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:14:22Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-916734331',display_name='tempest-TestNetworkAdvancedServerOps-server-916734331',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-916734331',id=21,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNjK9C4E8bflG16gJB9wwBClk5jcuXQVv3We2+BCOLjSNX7yYBZtu1brRwsPtQfOxw6VpHDk7NqDL7enVTxwMx/7GPVdFOQaYGcupsXO8r3ahMk0w7jvpDNeupZh1J8veQ==',key_name='tempest-TestNetworkAdvancedServerOps-1365789110',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:14:35Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8a545a398e2e433bbe3f3dfa2ec4ebcb',ramdisk_id='',reservation_id='r-t4o04fv6',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-585003851',owner_user_name='tempest-TestNetworkAdvancedServerOps-585003851-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:14:58Z,user_data=None,user_id='db22b0e0f6594362af24484ba9b01936',uuid=286b3745-27da-4e69-9c3b-cfbe4a28e2e9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "82d68894-8f3b-4d74-8f89-efe26013d5be", "address": "fa:16:3e:38:a2:a6", "network": {"id": "efa8245e-71d2-4349-9429-c565da73214b", "bridge": "br-int", "label": "tempest-network-smoke--1789304152", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d68894-8f", "ovs_interfaceid": "82d68894-8f3b-4d74-8f89-efe26013d5be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.290 2 DEBUG nova.network.os_vif_util [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converting VIF {"id": "82d68894-8f3b-4d74-8f89-efe26013d5be", "address": "fa:16:3e:38:a2:a6", "network": {"id": "efa8245e-71d2-4349-9429-c565da73214b", "bridge": "br-int", "label": "tempest-network-smoke--1789304152", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d68894-8f", "ovs_interfaceid": "82d68894-8f3b-4d74-8f89-efe26013d5be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.291 2 DEBUG nova.network.os_vif_util [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:38:a2:a6,bridge_name='br-int',has_traffic_filtering=True,id=82d68894-8f3b-4d74-8f89-efe26013d5be,network=Network(efa8245e-71d2-4349-9429-c565da73214b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82d68894-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.291 2 DEBUG os_vif [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:a2:a6,bridge_name='br-int',has_traffic_filtering=True,id=82d68894-8f3b-4d74-8f89-efe26013d5be,network=Network(efa8245e-71d2-4349-9429-c565da73214b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82d68894-8f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.294 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82d68894-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.299 2 INFO os_vif [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:38:a2:a6,bridge_name='br-int',has_traffic_filtering=True,id=82d68894-8f3b-4d74-8f89-efe26013d5be,network=Network(efa8245e-71d2-4349-9429-c565da73214b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82d68894-8f')#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.300 2 INFO nova.virt.libvirt.driver [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Deleting instance files /var/lib/nova/instances/286b3745-27da-4e69-9c3b-cfbe4a28e2e9_del#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.301 2 INFO nova.virt.libvirt.driver [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Deletion of /var/lib/nova/instances/286b3745-27da-4e69-9c3b-cfbe4a28e2e9_del complete#033[00m
Oct  7 16:15:03 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-deee14105802b641b2458bc9200f8c97ab060a7d530ccf413749a4ce08aa6795-userdata-shm.mount: Deactivated successfully.
Oct  7 16:15:03 np0005474864 systemd[1]: var-lib-containers-storage-overlay-0350a589988e71b6832eda819e040d32ef7b2fe4b9665c40b9fef894dfef2b7b-merged.mount: Deactivated successfully.
Oct  7 16:15:03 np0005474864 podman[223976]: 2025-10-07 20:15:03.331327028 +0000 UTC m=+0.113217940 container cleanup deee14105802b641b2458bc9200f8c97ab060a7d530ccf413749a4ce08aa6795 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-efa8245e-71d2-4349-9429-c565da73214b, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  7 16:15:03 np0005474864 systemd[1]: libpod-conmon-deee14105802b641b2458bc9200f8c97ab060a7d530ccf413749a4ce08aa6795.scope: Deactivated successfully.
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.358 2 INFO nova.compute.manager [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.359 2 DEBUG oslo.service.loopingcall [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.359 2 DEBUG nova.compute.manager [-] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.360 2 DEBUG nova.network.neutron [-] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  7 16:15:03 np0005474864 podman[224019]: 2025-10-07 20:15:03.401590204 +0000 UTC m=+0.045670052 container remove deee14105802b641b2458bc9200f8c97ab060a7d530ccf413749a4ce08aa6795 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-efa8245e-71d2-4349-9429-c565da73214b, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  7 16:15:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:03.408 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[8afe2040-d42c-42ca-a969-4fc9f2cecdbb]: (4, ('Tue Oct  7 08:15:03 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-efa8245e-71d2-4349-9429-c565da73214b (deee14105802b641b2458bc9200f8c97ab060a7d530ccf413749a4ce08aa6795)\ndeee14105802b641b2458bc9200f8c97ab060a7d530ccf413749a4ce08aa6795\nTue Oct  7 08:15:03 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-efa8245e-71d2-4349-9429-c565da73214b (deee14105802b641b2458bc9200f8c97ab060a7d530ccf413749a4ce08aa6795)\ndeee14105802b641b2458bc9200f8c97ab060a7d530ccf413749a4ce08aa6795\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:03.410 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[df7b3652-f15c-49ba-b4c9-ae46e8e6e75d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:03.411 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapefa8245e-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:03 np0005474864 kernel: tapefa8245e-70: left promiscuous mode
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:03.439 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[15e25143-beb9-4ac2-bb20-f95a17572804]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:03.471 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[e69f306c-63bf-497a-a00d-f038abeda00d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:03.473 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[cad1d5e6-81c3-454f-bd4f-30a9e0483edd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.492 2 DEBUG nova.compute.manager [req-479168f1-783e-42a0-bdd0-065cb613151b req-3515b160-8618-4cde-8a37-837a60b622ba 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Received event network-changed-82d68894-8f3b-4d74-8f89-efe26013d5be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.493 2 DEBUG nova.compute.manager [req-479168f1-783e-42a0-bdd0-065cb613151b req-3515b160-8618-4cde-8a37-837a60b622ba 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Refreshing instance network info cache due to event network-changed-82d68894-8f3b-4d74-8f89-efe26013d5be. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.493 2 DEBUG oslo_concurrency.lockutils [req-479168f1-783e-42a0-bdd0-065cb613151b req-3515b160-8618-4cde-8a37-837a60b622ba 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-286b3745-27da-4e69-9c3b-cfbe4a28e2e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.493 2 DEBUG oslo_concurrency.lockutils [req-479168f1-783e-42a0-bdd0-065cb613151b req-3515b160-8618-4cde-8a37-837a60b622ba 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-286b3745-27da-4e69-9c3b-cfbe4a28e2e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:15:03 np0005474864 nova_compute[192593]: 2025-10-07 20:15:03.494 2 DEBUG nova.network.neutron [req-479168f1-783e-42a0-bdd0-065cb613151b req-3515b160-8618-4cde-8a37-837a60b622ba 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Refreshing network info cache for port 82d68894-8f3b-4d74-8f89-efe26013d5be _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:15:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:03.499 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[1f345fb7-8b5c-4247-8684-5499d92e0b0b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371053, 'reachable_time': 21080, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224034, 'error': None, 'target': 'ovnmeta-efa8245e-71d2-4349-9429-c565da73214b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:03 np0005474864 systemd[1]: run-netns-ovnmeta\x2defa8245e\x2d71d2\x2d4349\x2d9429\x2dc565da73214b.mount: Deactivated successfully.
Oct  7 16:15:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:03.505 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-efa8245e-71d2-4349-9429-c565da73214b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:15:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:03.505 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[5a733e4b-0866-4482-99c8-51a6d9f31fde]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:04 np0005474864 nova_compute[192593]: 2025-10-07 20:15:04.983 2 DEBUG nova.network.neutron [-] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:15:05 np0005474864 nova_compute[192593]: 2025-10-07 20:15:05.097 2 INFO nova.compute.manager [-] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Took 1.74 seconds to deallocate network for instance.#033[00m
Oct  7 16:15:05 np0005474864 nova_compute[192593]: 2025-10-07 20:15:05.192 2 DEBUG oslo_concurrency.lockutils [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:15:05 np0005474864 nova_compute[192593]: 2025-10-07 20:15:05.193 2 DEBUG oslo_concurrency.lockutils [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:15:05 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:05.223 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:17:08 2001:db8:0:1:f816:3eff:fe58:1708 2001:db8::f816:3eff:fe58:1708'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe58:1708/64 2001:db8::f816:3eff:fe58:1708/64', 'neutron:device_id': 'ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-99465e0c-6ee8-477a-94aa-ab737f76f9e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7fb1d4ce-b691-4091-872a-86df16b02e47, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3fba40e3-39d9-4871-b9b9-3e2e5088af4f) old=Port_Binding(mac=['fa:16:3e:58:17:08 2001:db8::f816:3eff:fe58:1708'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe58:1708/64', 'neutron:device_id': 'ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-99465e0c-6ee8-477a-94aa-ab737f76f9e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:15:05 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:05.226 103685 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3fba40e3-39d9-4871-b9b9-3e2e5088af4f in datapath 99465e0c-6ee8-477a-94aa-ab737f76f9e4 updated#033[00m
Oct  7 16:15:05 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:05.229 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 99465e0c-6ee8-477a-94aa-ab737f76f9e4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:15:05 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:05.230 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[cf1dae38-35ef-41df-a10b-2f9d836e40d7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:05 np0005474864 nova_compute[192593]: 2025-10-07 20:15:05.247 2 DEBUG nova.compute.provider_tree [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:15:05 np0005474864 nova_compute[192593]: 2025-10-07 20:15:05.260 2 DEBUG nova.scheduler.client.report [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:15:05 np0005474864 nova_compute[192593]: 2025-10-07 20:15:05.295 2 DEBUG oslo_concurrency.lockutils [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:15:05 np0005474864 nova_compute[192593]: 2025-10-07 20:15:05.332 2 INFO nova.scheduler.client.report [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Deleted allocations for instance 286b3745-27da-4e69-9c3b-cfbe4a28e2e9#033[00m
Oct  7 16:15:05 np0005474864 podman[224037]: 2025-10-07 20:15:05.378137421 +0000 UTC m=+0.069996894 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.vendor=CentOS)
Oct  7 16:15:05 np0005474864 nova_compute[192593]: 2025-10-07 20:15:05.388 2 DEBUG oslo_concurrency.lockutils [None req-ee04e580-9451-4364-bdc9-7501487ef227 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "286b3745-27da-4e69-9c3b-cfbe4a28e2e9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.412s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:15:05 np0005474864 podman[224035]: 2025-10-07 20:15:05.391070411 +0000 UTC m=+0.084273502 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid)
Oct  7 16:15:05 np0005474864 podman[224036]: 2025-10-07 20:15:05.451371297 +0000 UTC m=+0.146385610 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  7 16:15:05 np0005474864 nova_compute[192593]: 2025-10-07 20:15:05.600 2 DEBUG nova.compute.manager [req-9ab310a8-eb76-4943-8c98-4f21838fc775 req-05793c66-e22e-461f-97fd-8eaf2bc4a3a1 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Received event network-vif-deleted-82d68894-8f3b-4d74-8f89-efe26013d5be external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:15:05 np0005474864 nova_compute[192593]: 2025-10-07 20:15:05.644 2 DEBUG nova.network.neutron [req-479168f1-783e-42a0-bdd0-065cb613151b req-3515b160-8618-4cde-8a37-837a60b622ba 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Updated VIF entry in instance network info cache for port 82d68894-8f3b-4d74-8f89-efe26013d5be. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:15:05 np0005474864 nova_compute[192593]: 2025-10-07 20:15:05.645 2 DEBUG nova.network.neutron [req-479168f1-783e-42a0-bdd0-065cb613151b req-3515b160-8618-4cde-8a37-837a60b622ba 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Updating instance_info_cache with network_info: [{"id": "82d68894-8f3b-4d74-8f89-efe26013d5be", "address": "fa:16:3e:38:a2:a6", "network": {"id": "efa8245e-71d2-4349-9429-c565da73214b", "bridge": "br-int", "label": "tempest-network-smoke--1789304152", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82d68894-8f", "ovs_interfaceid": "82d68894-8f3b-4d74-8f89-efe26013d5be", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:15:05 np0005474864 nova_compute[192593]: 2025-10-07 20:15:05.670 2 DEBUG oslo_concurrency.lockutils [req-479168f1-783e-42a0-bdd0-065cb613151b req-3515b160-8618-4cde-8a37-837a60b622ba 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-286b3745-27da-4e69-9c3b-cfbe4a28e2e9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:15:08 np0005474864 nova_compute[192593]: 2025-10-07 20:15:08.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:08 np0005474864 nova_compute[192593]: 2025-10-07 20:15:08.296 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:11 np0005474864 podman[224096]: 2025-10-07 20:15:11.400285175 +0000 UTC m=+0.090915132 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent)
Oct  7 16:15:12 np0005474864 nova_compute[192593]: 2025-10-07 20:15:12.131 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:12 np0005474864 nova_compute[192593]: 2025-10-07 20:15:12.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:13 np0005474864 nova_compute[192593]: 2025-10-07 20:15:13.089 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:13 np0005474864 nova_compute[192593]: 2025-10-07 20:15:13.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:14 np0005474864 podman[224118]: 2025-10-07 20:15:14.392713557 +0000 UTC m=+0.074027979 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:15:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:16.188 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:15:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:16.189 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:15:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:16.189 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:15:16 np0005474864 nova_compute[192593]: 2025-10-07 20:15:16.904 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:15:16 np0005474864 nova_compute[192593]: 2025-10-07 20:15:16.905 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:15:16 np0005474864 nova_compute[192593]: 2025-10-07 20:15:16.928 2 DEBUG nova.compute.manager [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.030 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.031 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.041 2 DEBUG nova.virt.hardware [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.042 2 INFO nova.compute.claims [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.119 2 DEBUG nova.scheduler.client.report [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Refreshing inventories for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.148 2 DEBUG nova.scheduler.client.report [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Updating ProviderTree inventory for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.148 2 DEBUG nova.compute.provider_tree [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Updating inventory in ProviderTree for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.181 2 DEBUG nova.scheduler.client.report [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Refreshing aggregate associations for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.217 2 DEBUG nova.scheduler.client.report [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Refreshing trait associations for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.274 2 DEBUG nova.compute.provider_tree [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.301 2 DEBUG nova.scheduler.client.report [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.330 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.299s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.331 2 DEBUG nova.compute.manager [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.386 2 DEBUG nova.compute.manager [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.387 2 DEBUG nova.network.neutron [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  7 16:15:17 np0005474864 podman[224142]: 2025-10-07 20:15:17.393771127 +0000 UTC m=+0.086235428 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.416 2 INFO nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.433 2 DEBUG nova.compute.manager [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.516 2 DEBUG nova.compute.manager [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.518 2 DEBUG nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.519 2 INFO nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Creating image(s)#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.520 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "/var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.521 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "/var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.522 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "/var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.545 2 DEBUG oslo_concurrency.processutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.602 2 DEBUG oslo_concurrency.processutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.604 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.605 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.630 2 DEBUG oslo_concurrency.processutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.649 2 DEBUG nova.policy [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.685 2 DEBUG oslo_concurrency.processutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.686 2 DEBUG oslo_concurrency.processutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.722 2 DEBUG oslo_concurrency.processutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.724 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.119s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.725 2 DEBUG oslo_concurrency.processutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.783 2 DEBUG oslo_concurrency.processutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.784 2 DEBUG nova.virt.disk.api [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Checking if we can resize image /var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.785 2 DEBUG oslo_concurrency.processutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.841 2 DEBUG oslo_concurrency.processutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.842 2 DEBUG nova.virt.disk.api [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Cannot resize image /var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.843 2 DEBUG nova.objects.instance [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'migration_context' on Instance uuid c491b943-fbbd-46e0-be8c-74a8c1378ab3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.862 2 DEBUG nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.862 2 DEBUG nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Ensure instance console log exists: /var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.863 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.864 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:15:17 np0005474864 nova_compute[192593]: 2025-10-07 20:15:17.864 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:15:18 np0005474864 nova_compute[192593]: 2025-10-07 20:15:18.092 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:18 np0005474864 nova_compute[192593]: 2025-10-07 20:15:18.262 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759868103.260295, 286b3745-27da-4e69-9c3b-cfbe4a28e2e9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:15:18 np0005474864 nova_compute[192593]: 2025-10-07 20:15:18.263 2 INFO nova.compute.manager [-] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] VM Stopped (Lifecycle Event)#033[00m
Oct  7 16:15:18 np0005474864 nova_compute[192593]: 2025-10-07 20:15:18.288 2 DEBUG nova.compute.manager [None req-91295134-5487-41ab-9f75-15f394250276 - - - - - -] [instance: 286b3745-27da-4e69-9c3b-cfbe4a28e2e9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:15:18 np0005474864 nova_compute[192593]: 2025-10-07 20:15:18.301 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:18 np0005474864 nova_compute[192593]: 2025-10-07 20:15:18.507 2 DEBUG nova.network.neutron [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Successfully created port: c1d00195-4d32-45ac-b745-1a913060f39d _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  7 16:15:19 np0005474864 nova_compute[192593]: 2025-10-07 20:15:19.061 2 DEBUG nova.network.neutron [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Successfully created port: fb8c9e14-7c02-42cf-9fe9-afc4a7316794 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  7 16:15:19 np0005474864 nova_compute[192593]: 2025-10-07 20:15:19.912 2 DEBUG nova.network.neutron [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Successfully updated port: c1d00195-4d32-45ac-b745-1a913060f39d _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:15:20 np0005474864 nova_compute[192593]: 2025-10-07 20:15:20.249 2 DEBUG nova.compute.manager [req-61b3ff56-60de-4d7f-95ec-f1fc7de5fe6b req-d8e6615a-3efc-44d5-a50c-21460eb7f82b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received event network-changed-c1d00195-4d32-45ac-b745-1a913060f39d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:15:20 np0005474864 nova_compute[192593]: 2025-10-07 20:15:20.250 2 DEBUG nova.compute.manager [req-61b3ff56-60de-4d7f-95ec-f1fc7de5fe6b req-d8e6615a-3efc-44d5-a50c-21460eb7f82b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Refreshing instance network info cache due to event network-changed-c1d00195-4d32-45ac-b745-1a913060f39d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:15:20 np0005474864 nova_compute[192593]: 2025-10-07 20:15:20.250 2 DEBUG oslo_concurrency.lockutils [req-61b3ff56-60de-4d7f-95ec-f1fc7de5fe6b req-d8e6615a-3efc-44d5-a50c-21460eb7f82b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-c491b943-fbbd-46e0-be8c-74a8c1378ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:15:20 np0005474864 nova_compute[192593]: 2025-10-07 20:15:20.251 2 DEBUG oslo_concurrency.lockutils [req-61b3ff56-60de-4d7f-95ec-f1fc7de5fe6b req-d8e6615a-3efc-44d5-a50c-21460eb7f82b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-c491b943-fbbd-46e0-be8c-74a8c1378ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:15:20 np0005474864 nova_compute[192593]: 2025-10-07 20:15:20.252 2 DEBUG nova.network.neutron [req-61b3ff56-60de-4d7f-95ec-f1fc7de5fe6b req-d8e6615a-3efc-44d5-a50c-21460eb7f82b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Refreshing network info cache for port c1d00195-4d32-45ac-b745-1a913060f39d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.125 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.126 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.126 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.127 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.347 2 DEBUG nova.network.neutron [req-61b3ff56-60de-4d7f-95ec-f1fc7de5fe6b req-d8e6615a-3efc-44d5-a50c-21460eb7f82b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.382 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.384 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5777MB free_disk=73.46393203735352GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.384 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.385 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.421 2 DEBUG nova.network.neutron [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Successfully updated port: fb8c9e14-7c02-42cf-9fe9-afc4a7316794 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.448 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "refresh_cache-c491b943-fbbd-46e0-be8c-74a8c1378ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.477 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Instance c491b943-fbbd-46e0-be8c-74a8c1378ab3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.478 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.478 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.522 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.548 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.574 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:15:21 np0005474864 nova_compute[192593]: 2025-10-07 20:15:21.575 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:15:22 np0005474864 nova_compute[192593]: 2025-10-07 20:15:22.528 2 DEBUG nova.compute.manager [req-652b3d8f-f83d-4a9e-a781-f64872c17b27 req-ed499b63-f41f-451d-ba88-d1b48112eb4a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received event network-changed-fb8c9e14-7c02-42cf-9fe9-afc4a7316794 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:15:22 np0005474864 nova_compute[192593]: 2025-10-07 20:15:22.528 2 DEBUG nova.compute.manager [req-652b3d8f-f83d-4a9e-a781-f64872c17b27 req-ed499b63-f41f-451d-ba88-d1b48112eb4a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Refreshing instance network info cache due to event network-changed-fb8c9e14-7c02-42cf-9fe9-afc4a7316794. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:15:22 np0005474864 nova_compute[192593]: 2025-10-07 20:15:22.528 2 DEBUG oslo_concurrency.lockutils [req-652b3d8f-f83d-4a9e-a781-f64872c17b27 req-ed499b63-f41f-451d-ba88-d1b48112eb4a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-c491b943-fbbd-46e0-be8c-74a8c1378ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:15:22 np0005474864 nova_compute[192593]: 2025-10-07 20:15:22.657 2 DEBUG nova.network.neutron [req-61b3ff56-60de-4d7f-95ec-f1fc7de5fe6b req-d8e6615a-3efc-44d5-a50c-21460eb7f82b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:15:22 np0005474864 nova_compute[192593]: 2025-10-07 20:15:22.680 2 DEBUG oslo_concurrency.lockutils [req-61b3ff56-60de-4d7f-95ec-f1fc7de5fe6b req-d8e6615a-3efc-44d5-a50c-21460eb7f82b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-c491b943-fbbd-46e0-be8c-74a8c1378ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:15:22 np0005474864 nova_compute[192593]: 2025-10-07 20:15:22.680 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquired lock "refresh_cache-c491b943-fbbd-46e0-be8c-74a8c1378ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:15:22 np0005474864 nova_compute[192593]: 2025-10-07 20:15:22.680 2 DEBUG nova.network.neutron [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:15:22 np0005474864 nova_compute[192593]: 2025-10-07 20:15:22.953 2 DEBUG nova.network.neutron [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:15:23 np0005474864 nova_compute[192593]: 2025-10-07 20:15:23.093 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:23 np0005474864 nova_compute[192593]: 2025-10-07 20:15:23.303 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:23 np0005474864 nova_compute[192593]: 2025-10-07 20:15:23.575 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:15:23 np0005474864 nova_compute[192593]: 2025-10-07 20:15:23.576 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:15:23 np0005474864 nova_compute[192593]: 2025-10-07 20:15:23.577 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:15:23 np0005474864 nova_compute[192593]: 2025-10-07 20:15:23.614 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  7 16:15:23 np0005474864 nova_compute[192593]: 2025-10-07 20:15:23.615 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:15:24 np0005474864 nova_compute[192593]: 2025-10-07 20:15:24.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:15:24 np0005474864 nova_compute[192593]: 2025-10-07 20:15:24.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:15:25 np0005474864 nova_compute[192593]: 2025-10-07 20:15:25.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:15:25 np0005474864 nova_compute[192593]: 2025-10-07 20:15:25.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:15:25 np0005474864 nova_compute[192593]: 2025-10-07 20:15:25.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:15:25 np0005474864 nova_compute[192593]: 2025-10-07 20:15:25.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:15:26 np0005474864 nova_compute[192593]: 2025-10-07 20:15:26.088 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.123 2 DEBUG nova.network.neutron [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Updating instance_info_cache with network_info: [{"id": "c1d00195-4d32-45ac-b745-1a913060f39d", "address": "fa:16:3e:e1:c8:d3", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d00195-4d", "ovs_interfaceid": "c1d00195-4d32-45ac-b745-1a913060f39d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "address": "fa:16:3e:24:aa:7e", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8c9e14-7c", "ovs_interfaceid": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.187 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Releasing lock "refresh_cache-c491b943-fbbd-46e0-be8c-74a8c1378ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.187 2 DEBUG nova.compute.manager [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Instance network_info: |[{"id": "c1d00195-4d32-45ac-b745-1a913060f39d", "address": "fa:16:3e:e1:c8:d3", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d00195-4d", "ovs_interfaceid": "c1d00195-4d32-45ac-b745-1a913060f39d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "address": "fa:16:3e:24:aa:7e", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8c9e14-7c", "ovs_interfaceid": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.189 2 DEBUG oslo_concurrency.lockutils [req-652b3d8f-f83d-4a9e-a781-f64872c17b27 req-ed499b63-f41f-451d-ba88-d1b48112eb4a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-c491b943-fbbd-46e0-be8c-74a8c1378ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.189 2 DEBUG nova.network.neutron [req-652b3d8f-f83d-4a9e-a781-f64872c17b27 req-ed499b63-f41f-451d-ba88-d1b48112eb4a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Refreshing network info cache for port fb8c9e14-7c02-42cf-9fe9-afc4a7316794 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.196 2 DEBUG nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Start _get_guest_xml network_info=[{"id": "c1d00195-4d32-45ac-b745-1a913060f39d", "address": "fa:16:3e:e1:c8:d3", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d00195-4d", "ovs_interfaceid": "c1d00195-4d32-45ac-b745-1a913060f39d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "address": "fa:16:3e:24:aa:7e", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8c9e14-7c", "ovs_interfaceid": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.204 2 WARNING nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.209 2 DEBUG nova.virt.libvirt.host [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.210 2 DEBUG nova.virt.libvirt.host [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.221 2 DEBUG nova.virt.libvirt.host [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.222 2 DEBUG nova.virt.libvirt.host [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.224 2 DEBUG nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.225 2 DEBUG nova.virt.hardware [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T20:09:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3fec056a-1226-48ad-a02c-e4fe097a9363',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.225 2 DEBUG nova.virt.hardware [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.226 2 DEBUG nova.virt.hardware [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.226 2 DEBUG nova.virt.hardware [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.227 2 DEBUG nova.virt.hardware [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.227 2 DEBUG nova.virt.hardware [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.227 2 DEBUG nova.virt.hardware [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.228 2 DEBUG nova.virt.hardware [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.228 2 DEBUG nova.virt.hardware [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.229 2 DEBUG nova.virt.hardware [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.229 2 DEBUG nova.virt.hardware [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.234 2 DEBUG nova.virt.libvirt.vif [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:15:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1139550974',display_name='tempest-TestGettingAddress-server-1139550974',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1139550974',id=25,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMHh+5BPf+7kY8pKVSG/i4hyBxtn8tVeo7dRQWQLUq5e94pXRwlhKtKtONimxLXuRPnL+V+cWD3ZzAcmW1n0ScN01FDbsZi+mcaAgIQIdbWXoCLml/sEmzbizmftxgLGkQ==',key_name='tempest-TestGettingAddress-195199531',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-nrbptih5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:15:17Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=c491b943-fbbd-46e0-be8c-74a8c1378ab3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c1d00195-4d32-45ac-b745-1a913060f39d", "address": "fa:16:3e:e1:c8:d3", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d00195-4d", "ovs_interfaceid": "c1d00195-4d32-45ac-b745-1a913060f39d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.235 2 DEBUG nova.network.os_vif_util [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "c1d00195-4d32-45ac-b745-1a913060f39d", "address": "fa:16:3e:e1:c8:d3", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d00195-4d", "ovs_interfaceid": "c1d00195-4d32-45ac-b745-1a913060f39d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.236 2 DEBUG nova.network.os_vif_util [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:c8:d3,bridge_name='br-int',has_traffic_filtering=True,id=c1d00195-4d32-45ac-b745-1a913060f39d,network=Network(d6a1d6d4-586d-450e-8b73-6ad134098649),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1d00195-4d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.237 2 DEBUG nova.virt.libvirt.vif [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:15:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1139550974',display_name='tempest-TestGettingAddress-server-1139550974',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1139550974',id=25,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMHh+5BPf+7kY8pKVSG/i4hyBxtn8tVeo7dRQWQLUq5e94pXRwlhKtKtONimxLXuRPnL+V+cWD3ZzAcmW1n0ScN01FDbsZi+mcaAgIQIdbWXoCLml/sEmzbizmftxgLGkQ==',key_name='tempest-TestGettingAddress-195199531',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-nrbptih5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:15:17Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=c491b943-fbbd-46e0-be8c-74a8c1378ab3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "address": "fa:16:3e:24:aa:7e", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8c9e14-7c", "ovs_interfaceid": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.238 2 DEBUG nova.network.os_vif_util [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "address": "fa:16:3e:24:aa:7e", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8c9e14-7c", "ovs_interfaceid": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.239 2 DEBUG nova.network.os_vif_util [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:aa:7e,bridge_name='br-int',has_traffic_filtering=True,id=fb8c9e14-7c02-42cf-9fe9-afc4a7316794,network=Network(99465e0c-6ee8-477a-94aa-ab737f76f9e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb8c9e14-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.240 2 DEBUG nova.objects.instance [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'pci_devices' on Instance uuid c491b943-fbbd-46e0-be8c-74a8c1378ab3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.266 2 DEBUG nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] End _get_guest_xml xml=<domain type="kvm">
Oct  7 16:15:27 np0005474864 nova_compute[192593]:  <uuid>c491b943-fbbd-46e0-be8c-74a8c1378ab3</uuid>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:  <name>instance-00000019</name>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:  <memory>131072</memory>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:  <vcpu>1</vcpu>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <nova:name>tempest-TestGettingAddress-server-1139550974</nova:name>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <nova:creationTime>2025-10-07 20:15:27</nova:creationTime>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <nova:flavor name="m1.nano">
Oct  7 16:15:27 np0005474864 nova_compute[192593]:        <nova:memory>128</nova:memory>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:        <nova:disk>1</nova:disk>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:        <nova:swap>0</nova:swap>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:        <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:        <nova:vcpus>1</nova:vcpus>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      </nova:flavor>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <nova:owner>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:        <nova:user uuid="334f092941fc46c496c7def76b2cfe18">tempest-TestGettingAddress-626136673-project-member</nova:user>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:        <nova:project uuid="2f9bf744045540618c9980fd4a7694f5">tempest-TestGettingAddress-626136673</nova:project>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      </nova:owner>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <nova:ports>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:        <nova:port uuid="c1d00195-4d32-45ac-b745-1a913060f39d">
Oct  7 16:15:27 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:        <nova:port uuid="fb8c9e14-7c02-42cf-9fe9-afc4a7316794">
Oct  7 16:15:27 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe24:aa7e" ipVersion="6"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe24:aa7e" ipVersion="6"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      </nova:ports>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    </nova:instance>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:  <sysinfo type="smbios">
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <entry name="manufacturer">RDO</entry>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <entry name="product">OpenStack Compute</entry>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <entry name="serial">c491b943-fbbd-46e0-be8c-74a8c1378ab3</entry>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <entry name="uuid">c491b943-fbbd-46e0-be8c-74a8c1378ab3</entry>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <entry name="family">Virtual Machine</entry>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <boot dev="hd"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <smbios mode="sysinfo"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <vmcoreinfo/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:  <clock offset="utc">
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <timer name="pit" tickpolicy="delay"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <timer name="hpet" present="no"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:  <cpu mode="custom" match="exact">
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <model>Nehalem</model>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <topology sockets="1" cores="1" threads="1"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <disk type="file" device="disk">
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <target dev="vda" bus="virtio"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <disk type="file" device="cdrom">
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <driver name="qemu" type="raw" cache="none"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.config"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <target dev="sda" bus="sata"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:e1:c8:d3"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <target dev="tapc1d00195-4d"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:24:aa:7e"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <target dev="tapfb8c9e14-7c"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <serial type="pty">
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <log file="/var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/console.log" append="off"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <input type="tablet" bus="usb"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <rng model="virtio">
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <backend model="random">/dev/urandom</backend>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <controller type="usb" index="0"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    <memballoon model="virtio">
Oct  7 16:15:27 np0005474864 nova_compute[192593]:      <stats period="10"/>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:15:27 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:15:27 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:15:27 np0005474864 nova_compute[192593]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.271 2 DEBUG nova.compute.manager [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Preparing to wait for external event network-vif-plugged-c1d00195-4d32-45ac-b745-1a913060f39d prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.272 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.273 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.273 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.274 2 DEBUG nova.compute.manager [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Preparing to wait for external event network-vif-plugged-fb8c9e14-7c02-42cf-9fe9-afc4a7316794 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.274 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.275 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.275 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.277 2 DEBUG nova.virt.libvirt.vif [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:15:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1139550974',display_name='tempest-TestGettingAddress-server-1139550974',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1139550974',id=25,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMHh+5BPf+7kY8pKVSG/i4hyBxtn8tVeo7dRQWQLUq5e94pXRwlhKtKtONimxLXuRPnL+V+cWD3ZzAcmW1n0ScN01FDbsZi+mcaAgIQIdbWXoCLml/sEmzbizmftxgLGkQ==',key_name='tempest-TestGettingAddress-195199531',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-nrbptih5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:15:17Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=c491b943-fbbd-46e0-be8c-74a8c1378ab3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c1d00195-4d32-45ac-b745-1a913060f39d", "address": "fa:16:3e:e1:c8:d3", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d00195-4d", "ovs_interfaceid": "c1d00195-4d32-45ac-b745-1a913060f39d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.277 2 DEBUG nova.network.os_vif_util [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "c1d00195-4d32-45ac-b745-1a913060f39d", "address": "fa:16:3e:e1:c8:d3", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d00195-4d", "ovs_interfaceid": "c1d00195-4d32-45ac-b745-1a913060f39d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.279 2 DEBUG nova.network.os_vif_util [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e1:c8:d3,bridge_name='br-int',has_traffic_filtering=True,id=c1d00195-4d32-45ac-b745-1a913060f39d,network=Network(d6a1d6d4-586d-450e-8b73-6ad134098649),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1d00195-4d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.279 2 DEBUG os_vif [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:c8:d3,bridge_name='br-int',has_traffic_filtering=True,id=c1d00195-4d32-45ac-b745-1a913060f39d,network=Network(d6a1d6d4-586d-450e-8b73-6ad134098649),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1d00195-4d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.281 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.282 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.287 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc1d00195-4d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.288 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc1d00195-4d, col_values=(('external_ids', {'iface-id': 'c1d00195-4d32-45ac-b745-1a913060f39d', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e1:c8:d3', 'vm-uuid': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:27 np0005474864 NetworkManager[51631]: <info>  [1759868127.2920] manager: (tapc1d00195-4d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/69)
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.298 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.299 2 INFO os_vif [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e1:c8:d3,bridge_name='br-int',has_traffic_filtering=True,id=c1d00195-4d32-45ac-b745-1a913060f39d,network=Network(d6a1d6d4-586d-450e-8b73-6ad134098649),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1d00195-4d')#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.300 2 DEBUG nova.virt.libvirt.vif [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:15:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1139550974',display_name='tempest-TestGettingAddress-server-1139550974',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1139550974',id=25,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMHh+5BPf+7kY8pKVSG/i4hyBxtn8tVeo7dRQWQLUq5e94pXRwlhKtKtONimxLXuRPnL+V+cWD3ZzAcmW1n0ScN01FDbsZi+mcaAgIQIdbWXoCLml/sEmzbizmftxgLGkQ==',key_name='tempest-TestGettingAddress-195199531',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-nrbptih5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:15:17Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=c491b943-fbbd-46e0-be8c-74a8c1378ab3,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "address": "fa:16:3e:24:aa:7e", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8c9e14-7c", "ovs_interfaceid": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.301 2 DEBUG nova.network.os_vif_util [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "address": "fa:16:3e:24:aa:7e", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8c9e14-7c", "ovs_interfaceid": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.302 2 DEBUG nova.network.os_vif_util [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:24:aa:7e,bridge_name='br-int',has_traffic_filtering=True,id=fb8c9e14-7c02-42cf-9fe9-afc4a7316794,network=Network(99465e0c-6ee8-477a-94aa-ab737f76f9e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb8c9e14-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.303 2 DEBUG os_vif [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:aa:7e,bridge_name='br-int',has_traffic_filtering=True,id=fb8c9e14-7c02-42cf-9fe9-afc4a7316794,network=Network(99465e0c-6ee8-477a-94aa-ab737f76f9e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb8c9e14-7c') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.304 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.305 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.308 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.309 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfb8c9e14-7c, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.310 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfb8c9e14-7c, col_values=(('external_ids', {'iface-id': 'fb8c9e14-7c02-42cf-9fe9-afc4a7316794', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:24:aa:7e', 'vm-uuid': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:27 np0005474864 NetworkManager[51631]: <info>  [1759868127.3141] manager: (tapfb8c9e14-7c): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.325 2 INFO os_vif [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:24:aa:7e,bridge_name='br-int',has_traffic_filtering=True,id=fb8c9e14-7c02-42cf-9fe9-afc4a7316794,network=Network(99465e0c-6ee8-477a-94aa-ab737f76f9e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb8c9e14-7c')#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.396 2 DEBUG nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.397 2 DEBUG nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.398 2 DEBUG nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No VIF found with MAC fa:16:3e:e1:c8:d3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.398 2 DEBUG nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No VIF found with MAC fa:16:3e:24:aa:7e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:15:27 np0005474864 nova_compute[192593]: 2025-10-07 20:15:27.399 2 INFO nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Using config drive#033[00m
Oct  7 16:15:28 np0005474864 nova_compute[192593]: 2025-10-07 20:15:28.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:15:28 np0005474864 nova_compute[192593]: 2025-10-07 20:15:28.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:28 np0005474864 nova_compute[192593]: 2025-10-07 20:15:28.334 2 INFO nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Creating config drive at /var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.config#033[00m
Oct  7 16:15:28 np0005474864 nova_compute[192593]: 2025-10-07 20:15:28.344 2 DEBUG oslo_concurrency.processutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppem2s9cn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:15:28 np0005474864 nova_compute[192593]: 2025-10-07 20:15:28.486 2 DEBUG oslo_concurrency.processutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmppem2s9cn" returned: 0 in 0.142s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:15:28 np0005474864 kernel: tapc1d00195-4d: entered promiscuous mode
Oct  7 16:15:28 np0005474864 NetworkManager[51631]: <info>  [1759868128.5706] manager: (tapc1d00195-4d): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Oct  7 16:15:28 np0005474864 nova_compute[192593]: 2025-10-07 20:15:28.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:28 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:28Z|00128|binding|INFO|Claiming lport c1d00195-4d32-45ac-b745-1a913060f39d for this chassis.
Oct  7 16:15:28 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:28Z|00129|binding|INFO|c1d00195-4d32-45ac-b745-1a913060f39d: Claiming fa:16:3e:e1:c8:d3 10.100.0.6
Oct  7 16:15:28 np0005474864 systemd-udevd[224199]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:15:28 np0005474864 NetworkManager[51631]: <info>  [1759868128.6220] manager: (tapfb8c9e14-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/72)
Oct  7 16:15:28 np0005474864 systemd-udevd[224204]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.631 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:c8:d3 10.100.0.6'], port_security=['fa:16:3e:e1:c8:d3 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6a1d6d4-586d-450e-8b73-6ad134098649', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a6b53ec8-0088-49b2-96e7-c4770f1b7fbc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34f1dcb0-f04e-41a8-8b02-05684b457dc5, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=c1d00195-4d32-45ac-b745-1a913060f39d) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.633 103685 INFO neutron.agent.ovn.metadata.agent [-] Port c1d00195-4d32-45ac-b745-1a913060f39d in datapath d6a1d6d4-586d-450e-8b73-6ad134098649 bound to our chassis#033[00m
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.637 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d6a1d6d4-586d-450e-8b73-6ad134098649#033[00m
Oct  7 16:15:28 np0005474864 NetworkManager[51631]: <info>  [1759868128.6405] device (tapc1d00195-4d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:15:28 np0005474864 NetworkManager[51631]: <info>  [1759868128.6415] device (tapc1d00195-4d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.652 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[181fe368-4396-412e-bc88-3480a18ddc9f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.654 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd6a1d6d4-51 in ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.658 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd6a1d6d4-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.659 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[2e60ab05-8ecc-4ab2-be88-2373c5c4a978]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.660 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[29413d44-f630-4ac4-88f5-d7c14936e6a1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.676 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[ccfebba3-cede-4dc6-a8cc-d55022a43177]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:28 np0005474864 systemd-machined[152586]: New machine qemu-8-instance-00000019.
Oct  7 16:15:28 np0005474864 NetworkManager[51631]: <info>  [1759868128.6988] device (tapfb8c9e14-7c): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:15:28 np0005474864 kernel: tapfb8c9e14-7c: entered promiscuous mode
Oct  7 16:15:28 np0005474864 NetworkManager[51631]: <info>  [1759868128.7006] device (tapfb8c9e14-7c): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:15:28 np0005474864 nova_compute[192593]: 2025-10-07 20:15:28.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:28 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:28Z|00130|binding|INFO|Claiming lport fb8c9e14-7c02-42cf-9fe9-afc4a7316794 for this chassis.
Oct  7 16:15:28 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:28Z|00131|binding|INFO|fb8c9e14-7c02-42cf-9fe9-afc4a7316794: Claiming fa:16:3e:24:aa:7e 2001:db8:0:1:f816:3eff:fe24:aa7e 2001:db8::f816:3eff:fe24:aa7e
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.704 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[1db58492-b779-4033-aa05-3d5d7127227a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:28 np0005474864 systemd[1]: Started Virtual Machine qemu-8-instance-00000019.
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.710 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:aa:7e 2001:db8:0:1:f816:3eff:fe24:aa7e 2001:db8::f816:3eff:fe24:aa7e'], port_security=['fa:16:3e:24:aa:7e 2001:db8:0:1:f816:3eff:fe24:aa7e 2001:db8::f816:3eff:fe24:aa7e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe24:aa7e/64 2001:db8::f816:3eff:fe24:aa7e/64', 'neutron:device_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-99465e0c-6ee8-477a-94aa-ab737f76f9e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a6b53ec8-0088-49b2-96e7-c4770f1b7fbc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7fb1d4ce-b691-4091-872a-86df16b02e47, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=fb8c9e14-7c02-42cf-9fe9-afc4a7316794) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:15:28 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:28Z|00132|binding|INFO|Setting lport c1d00195-4d32-45ac-b745-1a913060f39d ovn-installed in OVS
Oct  7 16:15:28 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:28Z|00133|binding|INFO|Setting lport c1d00195-4d32-45ac-b745-1a913060f39d up in Southbound
Oct  7 16:15:28 np0005474864 nova_compute[192593]: 2025-10-07 20:15:28.711 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:28 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:28Z|00134|binding|INFO|Setting lport fb8c9e14-7c02-42cf-9fe9-afc4a7316794 ovn-installed in OVS
Oct  7 16:15:28 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:28Z|00135|binding|INFO|Setting lport fb8c9e14-7c02-42cf-9fe9-afc4a7316794 up in Southbound
Oct  7 16:15:28 np0005474864 nova_compute[192593]: 2025-10-07 20:15:28.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.744 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[69393c5a-8a9c-4f18-88c7-8f47835b267b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.751 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[01849489-c243-4f28-a670-2cdde0821b69]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:28 np0005474864 NetworkManager[51631]: <info>  [1759868128.7566] manager: (tapd6a1d6d4-50): new Veth device (/org/freedesktop/NetworkManager/Devices/73)
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.788 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[f7a1a881-db52-4577-920e-4d382daa1e71]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.791 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[aaa3306f-a004-4e3f-8e50-c81e420a1a57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:28 np0005474864 NetworkManager[51631]: <info>  [1759868128.8135] device (tapd6a1d6d4-50): carrier: link connected
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.820 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[1d599a14-cfba-412c-860f-b86a04dd758c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.845 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[917a80b5-6177-4835-b338-d71b7bb5c1ef]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6a1d6d4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:c9:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376470, 'reachable_time': 23219, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224240, 'error': None, 'target': 'ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.869 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[8f94fc48-d1e3-4aac-bc25-d434b3c2d91e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedd:c972'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376470, 'tstamp': 376470}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224241, 'error': None, 'target': 'ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.906 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[73a5aec0-c363-4445-a30e-f417976b847c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6a1d6d4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:c9:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376470, 'reachable_time': 23219, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224242, 'error': None, 'target': 'ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:28.953 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[24d9d4d8-4f03-4f91-a7be-2c9f5b654a7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.046 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[a5478714-2102-4b17-a3c6-c12a295f9ef8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.048 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6a1d6d4-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.048 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.049 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6a1d6d4-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:15:29 np0005474864 NetworkManager[51631]: <info>  [1759868129.0528] manager: (tapd6a1d6d4-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/74)
Oct  7 16:15:29 np0005474864 kernel: tapd6a1d6d4-50: entered promiscuous mode
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.056 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd6a1d6d4-50, col_values=(('external_ids', {'iface-id': '5cf38e83-5f07-4562-b663-4850a1d35f81'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.063 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d6a1d6d4-586d-450e-8b73-6ad134098649.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d6a1d6d4-586d-450e-8b73-6ad134098649.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.065 2 DEBUG nova.compute.manager [req-86cae862-66be-423c-956b-6674e5f75fad req-f29ce3b2-83a1-447e-913b-e30e34a2e82a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received event network-vif-plugged-fb8c9e14-7c02-42cf-9fe9-afc4a7316794 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.064 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[79d8c8ff-fd7e-40cb-84ec-6906eb914438]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.066 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-d6a1d6d4-586d-450e-8b73-6ad134098649
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/d6a1d6d4-586d-450e-8b73-6ad134098649.pid.haproxy
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID d6a1d6d4-586d-450e-8b73-6ad134098649
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:15:29 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:29Z|00136|binding|INFO|Releasing lport 5cf38e83-5f07-4562-b663-4850a1d35f81 from this chassis (sb_readonly=0)
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.067 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649', 'env', 'PROCESS_TAG=haproxy-d6a1d6d4-586d-450e-8b73-6ad134098649', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d6a1d6d4-586d-450e-8b73-6ad134098649.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.065 2 DEBUG oslo_concurrency.lockutils [req-86cae862-66be-423c-956b-6674e5f75fad req-f29ce3b2-83a1-447e-913b-e30e34a2e82a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.067 2 DEBUG oslo_concurrency.lockutils [req-86cae862-66be-423c-956b-6674e5f75fad req-f29ce3b2-83a1-447e-913b-e30e34a2e82a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.068 2 DEBUG oslo_concurrency.lockutils [req-86cae862-66be-423c-956b-6674e5f75fad req-f29ce3b2-83a1-447e-913b-e30e34a2e82a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.068 2 DEBUG nova.compute.manager [req-86cae862-66be-423c-956b-6674e5f75fad req-f29ce3b2-83a1-447e-913b-e30e34a2e82a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Processing event network-vif-plugged-fb8c9e14-7c02-42cf-9fe9-afc4a7316794 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.069 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.090 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.243 2 DEBUG nova.compute.manager [req-a3a19268-e7e1-4973-9a31-5bc3d0c62c9a req-8999f2f0-4e1f-4bfc-a6a1-98eb7bfe1864 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received event network-vif-plugged-c1d00195-4d32-45ac-b745-1a913060f39d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.244 2 DEBUG oslo_concurrency.lockutils [req-a3a19268-e7e1-4973-9a31-5bc3d0c62c9a req-8999f2f0-4e1f-4bfc-a6a1-98eb7bfe1864 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.245 2 DEBUG oslo_concurrency.lockutils [req-a3a19268-e7e1-4973-9a31-5bc3d0c62c9a req-8999f2f0-4e1f-4bfc-a6a1-98eb7bfe1864 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.249 2 DEBUG oslo_concurrency.lockutils [req-a3a19268-e7e1-4973-9a31-5bc3d0c62c9a req-8999f2f0-4e1f-4bfc-a6a1-98eb7bfe1864 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.003s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.251 2 DEBUG nova.compute.manager [req-a3a19268-e7e1-4973-9a31-5bc3d0c62c9a req-8999f2f0-4e1f-4bfc-a6a1-98eb7bfe1864 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Processing event network-vif-plugged-c1d00195-4d32-45ac-b745-1a913060f39d _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:15:29 np0005474864 podman[224282]: 2025-10-07 20:15:29.503174266 +0000 UTC m=+0.072458674 container create 402e48da0677afcc111abb888c3cdc1bab9e26fd5b9f5aa088e6308b01cc5696 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:15:29 np0005474864 systemd[1]: Started libpod-conmon-402e48da0677afcc111abb888c3cdc1bab9e26fd5b9f5aa088e6308b01cc5696.scope.
Oct  7 16:15:29 np0005474864 podman[224282]: 2025-10-07 20:15:29.462112412 +0000 UTC m=+0.031396880 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:15:29 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:15:29 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2094ad2571a4bdd633e9bccdea8d7d72b71afbb1cd7df15f928f8fdbd88565c6/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:15:29 np0005474864 podman[224282]: 2025-10-07 20:15:29.600708837 +0000 UTC m=+0.169993225 container init 402e48da0677afcc111abb888c3cdc1bab9e26fd5b9f5aa088e6308b01cc5696 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  7 16:15:29 np0005474864 podman[224282]: 2025-10-07 20:15:29.610274561 +0000 UTC m=+0.179558929 container start 402e48da0677afcc111abb888c3cdc1bab9e26fd5b9f5aa088e6308b01cc5696 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  7 16:15:29 np0005474864 neutron-haproxy-ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649[224297]: [NOTICE]   (224301) : New worker (224303) forked
Oct  7 16:15:29 np0005474864 neutron-haproxy-ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649[224297]: [NOTICE]   (224301) : Loading success.
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.655 103685 INFO neutron.agent.ovn.metadata.agent [-] Port fb8c9e14-7c02-42cf-9fe9-afc4a7316794 in datapath 99465e0c-6ee8-477a-94aa-ab737f76f9e4 unbound from our chassis#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.657 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 99465e0c-6ee8-477a-94aa-ab737f76f9e4#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.670 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[b11e5d0c-7858-4215-a00e-29629b2a6d08]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.671 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap99465e0c-61 in ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.673 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap99465e0c-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.674 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[84577849-af14-4502-9185-097cf820954e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.675 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[e00cb10d-b7ba-4769-a55f-290c3a490416]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.684 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[853db730-c07e-4440-b395-841413dc48f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.705 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[aafa94bc-94ad-498e-bf47-b16f2464b3a4]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.730 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[7dfdeb6c-c883-40d3-ae6c-fab08bde3e55]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:29 np0005474864 NetworkManager[51631]: <info>  [1759868129.7373] manager: (tap99465e0c-60): new Veth device (/org/freedesktop/NetworkManager/Devices/75)
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.736 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[e86b6cea-bfc0-4b1b-a5a5-b77ae3bb7ea8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.761 2 DEBUG nova.network.neutron [req-652b3d8f-f83d-4a9e-a781-f64872c17b27 req-ed499b63-f41f-451d-ba88-d1b48112eb4a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Updated VIF entry in instance network info cache for port fb8c9e14-7c02-42cf-9fe9-afc4a7316794. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.761 2 DEBUG nova.network.neutron [req-652b3d8f-f83d-4a9e-a781-f64872c17b27 req-ed499b63-f41f-451d-ba88-d1b48112eb4a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Updating instance_info_cache with network_info: [{"id": "c1d00195-4d32-45ac-b745-1a913060f39d", "address": "fa:16:3e:e1:c8:d3", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d00195-4d", "ovs_interfaceid": "c1d00195-4d32-45ac-b745-1a913060f39d", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "address": "fa:16:3e:24:aa:7e", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8c9e14-7c", "ovs_interfaceid": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.771 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[e545e977-e75b-4a70-b993-75b94d41be5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.774 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[e751c318-292f-431b-94cc-d3232ba49f76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.777 2 DEBUG oslo_concurrency.lockutils [req-652b3d8f-f83d-4a9e-a781-f64872c17b27 req-ed499b63-f41f-451d-ba88-d1b48112eb4a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-c491b943-fbbd-46e0-be8c-74a8c1378ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:15:29 np0005474864 NetworkManager[51631]: <info>  [1759868129.8029] device (tap99465e0c-60): carrier: link connected
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.808 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[4fc479da-a49c-440e-bc77-20f998eb0fff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.814 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868129.813793, c491b943-fbbd-46e0-be8c-74a8c1378ab3 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.814 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] VM Started (Lifecycle Event)#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.816 2 DEBUG nova.compute.manager [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.820 2 DEBUG nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.823 2 INFO nova.virt.libvirt.driver [-] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Instance spawned successfully.#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.823 2 DEBUG nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.830 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[6e232e49-1487-4d79-8113-f87faebdd903]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap99465e0c-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:17:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376569, 'reachable_time': 25883, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224322, 'error': None, 'target': 'ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.836 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.838 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.846 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[be5e878b-e7c1-4a19-a7f9-a0dbdbe8ddb0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe58:1708'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376569, 'tstamp': 376569}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224323, 'error': None, 'target': 'ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.860 2 DEBUG nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.860 2 DEBUG nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.861 2 DEBUG nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.861 2 DEBUG nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.861 2 DEBUG nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.862 2 DEBUG nova.virt.libvirt.driver [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.865 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.866 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868129.8161223, c491b943-fbbd-46e0-be8c-74a8c1378ab3 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.866 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.868 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[f9872bfc-e5c9-455b-9b59-7320e3567152]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap99465e0c-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:17:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376569, 'reachable_time': 25883, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224324, 'error': None, 'target': 'ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.892 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.894 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868129.82049, c491b943-fbbd-46e0-be8c-74a8c1378ab3 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.894 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.896 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[99801716-92a0-4791-88f0-2b0eb12ef7f8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.916 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.918 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.924 2 INFO nova.compute.manager [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Took 12.41 seconds to spawn the instance on the hypervisor.#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.924 2 DEBUG nova.compute.manager [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.933 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.935 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[9b6ed5ee-0abb-47f9-9ed5-e8e83f945f8e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.936 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99465e0c-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.936 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.936 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap99465e0c-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:15:29 np0005474864 NetworkManager[51631]: <info>  [1759868129.9385] manager: (tap99465e0c-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Oct  7 16:15:29 np0005474864 kernel: tap99465e0c-60: entered promiscuous mode
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.940 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap99465e0c-60, col_values=(('external_ids', {'iface-id': '3fba40e3-39d9-4871-b9b9-3e2e5088af4f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:15:29 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:29Z|00137|binding|INFO|Releasing lport 3fba40e3-39d9-4871-b9b9-3e2e5088af4f from this chassis (sb_readonly=0)
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.955 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/99465e0c-6ee8-477a-94aa-ab737f76f9e4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/99465e0c-6ee8-477a-94aa-ab737f76f9e4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.956 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[af24a4c2-a4be-4f6f-92d1-c1ae548f8cf9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.957 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-99465e0c-6ee8-477a-94aa-ab737f76f9e4
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/99465e0c-6ee8-477a-94aa-ab737f76f9e4.pid.haproxy
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID 99465e0c-6ee8-477a-94aa-ab737f76f9e4
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:15:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:29.957 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4', 'env', 'PROCESS_TAG=haproxy-99465e0c-6ee8-477a-94aa-ab737f76f9e4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/99465e0c-6ee8-477a-94aa-ab737f76f9e4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.978 2 INFO nova.compute.manager [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Took 12.98 seconds to build instance.#033[00m
Oct  7 16:15:29 np0005474864 nova_compute[192593]: 2025-10-07 20:15:29.992 2 DEBUG oslo_concurrency.lockutils [None req-dfbf3eda-7960-4fc7-8d0c-e3359172cbc0 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.087s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:15:30 np0005474864 podman[224354]: 2025-10-07 20:15:30.365549282 +0000 UTC m=+0.059517094 container create 6231a4e01d581b968a285e283969363895d12513731767a4ba8d878b2fe322ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:15:30 np0005474864 systemd[1]: Started libpod-conmon-6231a4e01d581b968a285e283969363895d12513731767a4ba8d878b2fe322ce.scope.
Oct  7 16:15:30 np0005474864 podman[224354]: 2025-10-07 20:15:30.33714796 +0000 UTC m=+0.031115812 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:15:30 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:15:30 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5e49f475b67c5c81ba39c9e68fdd287f05ea08d70167bb01844cd69ee5668480/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:15:30 np0005474864 podman[224354]: 2025-10-07 20:15:30.456054332 +0000 UTC m=+0.150022194 container init 6231a4e01d581b968a285e283969363895d12513731767a4ba8d878b2fe322ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:15:30 np0005474864 podman[224354]: 2025-10-07 20:15:30.464947237 +0000 UTC m=+0.158915059 container start 6231a4e01d581b968a285e283969363895d12513731767a4ba8d878b2fe322ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Oct  7 16:15:30 np0005474864 neutron-haproxy-ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4[224370]: [NOTICE]   (224393) : New worker (224398) forked
Oct  7 16:15:30 np0005474864 neutron-haproxy-ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4[224370]: [NOTICE]   (224393) : Loading success.
Oct  7 16:15:30 np0005474864 podman[224372]: 2025-10-07 20:15:30.516340427 +0000 UTC m=+0.075175692 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  7 16:15:30 np0005474864 podman[224373]: 2025-10-07 20:15:30.538937394 +0000 UTC m=+0.099868389 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, vendor=Red Hat, Inc., name=ubi9-minimal, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, version=9.6, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, distribution-scope=public, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  7 16:15:31 np0005474864 nova_compute[192593]: 2025-10-07 20:15:31.212 2 DEBUG nova.compute.manager [req-9d5491b3-6b10-4dc7-877c-47d700c04a73 req-a48182b3-dab5-43d7-9de9-d513c41344bc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received event network-vif-plugged-fb8c9e14-7c02-42cf-9fe9-afc4a7316794 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:15:31 np0005474864 nova_compute[192593]: 2025-10-07 20:15:31.212 2 DEBUG oslo_concurrency.lockutils [req-9d5491b3-6b10-4dc7-877c-47d700c04a73 req-a48182b3-dab5-43d7-9de9-d513c41344bc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:15:31 np0005474864 nova_compute[192593]: 2025-10-07 20:15:31.213 2 DEBUG oslo_concurrency.lockutils [req-9d5491b3-6b10-4dc7-877c-47d700c04a73 req-a48182b3-dab5-43d7-9de9-d513c41344bc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:15:31 np0005474864 nova_compute[192593]: 2025-10-07 20:15:31.213 2 DEBUG oslo_concurrency.lockutils [req-9d5491b3-6b10-4dc7-877c-47d700c04a73 req-a48182b3-dab5-43d7-9de9-d513c41344bc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:15:31 np0005474864 nova_compute[192593]: 2025-10-07 20:15:31.214 2 DEBUG nova.compute.manager [req-9d5491b3-6b10-4dc7-877c-47d700c04a73 req-a48182b3-dab5-43d7-9de9-d513c41344bc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] No waiting events found dispatching network-vif-plugged-fb8c9e14-7c02-42cf-9fe9-afc4a7316794 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:15:31 np0005474864 nova_compute[192593]: 2025-10-07 20:15:31.214 2 WARNING nova.compute.manager [req-9d5491b3-6b10-4dc7-877c-47d700c04a73 req-a48182b3-dab5-43d7-9de9-d513c41344bc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received unexpected event network-vif-plugged-fb8c9e14-7c02-42cf-9fe9-afc4a7316794 for instance with vm_state active and task_state None.#033[00m
Oct  7 16:15:31 np0005474864 nova_compute[192593]: 2025-10-07 20:15:31.336 2 DEBUG nova.compute.manager [req-8589c71c-d1ae-4a30-8632-6d5756f8cd50 req-c740cf75-0b9c-4299-969b-423c9d2d5830 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received event network-vif-plugged-c1d00195-4d32-45ac-b745-1a913060f39d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:15:31 np0005474864 nova_compute[192593]: 2025-10-07 20:15:31.336 2 DEBUG oslo_concurrency.lockutils [req-8589c71c-d1ae-4a30-8632-6d5756f8cd50 req-c740cf75-0b9c-4299-969b-423c9d2d5830 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:15:31 np0005474864 nova_compute[192593]: 2025-10-07 20:15:31.337 2 DEBUG oslo_concurrency.lockutils [req-8589c71c-d1ae-4a30-8632-6d5756f8cd50 req-c740cf75-0b9c-4299-969b-423c9d2d5830 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:15:31 np0005474864 nova_compute[192593]: 2025-10-07 20:15:31.337 2 DEBUG oslo_concurrency.lockutils [req-8589c71c-d1ae-4a30-8632-6d5756f8cd50 req-c740cf75-0b9c-4299-969b-423c9d2d5830 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:15:31 np0005474864 nova_compute[192593]: 2025-10-07 20:15:31.337 2 DEBUG nova.compute.manager [req-8589c71c-d1ae-4a30-8632-6d5756f8cd50 req-c740cf75-0b9c-4299-969b-423c9d2d5830 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] No waiting events found dispatching network-vif-plugged-c1d00195-4d32-45ac-b745-1a913060f39d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:15:31 np0005474864 nova_compute[192593]: 2025-10-07 20:15:31.338 2 WARNING nova.compute.manager [req-8589c71c-d1ae-4a30-8632-6d5756f8cd50 req-c740cf75-0b9c-4299-969b-423c9d2d5830 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received unexpected event network-vif-plugged-c1d00195-4d32-45ac-b745-1a913060f39d for instance with vm_state active and task_state None.#033[00m
Oct  7 16:15:32 np0005474864 nova_compute[192593]: 2025-10-07 20:15:32.312 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:33 np0005474864 nova_compute[192593]: 2025-10-07 20:15:33.143 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:34 np0005474864 nova_compute[192593]: 2025-10-07 20:15:34.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:34 np0005474864 NetworkManager[51631]: <info>  [1759868134.2508] manager: (patch-provnet-53a6f422-cce6-4082-98aa-3619989346bc-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Oct  7 16:15:34 np0005474864 NetworkManager[51631]: <info>  [1759868134.2526] manager: (patch-br-int-to-provnet-53a6f422-cce6-4082-98aa-3619989346bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/78)
Oct  7 16:15:34 np0005474864 nova_compute[192593]: 2025-10-07 20:15:34.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:34 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:34Z|00138|binding|INFO|Releasing lport 3fba40e3-39d9-4871-b9b9-3e2e5088af4f from this chassis (sb_readonly=0)
Oct  7 16:15:34 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:34Z|00139|binding|INFO|Releasing lport 5cf38e83-5f07-4562-b663-4850a1d35f81 from this chassis (sb_readonly=0)
Oct  7 16:15:34 np0005474864 nova_compute[192593]: 2025-10-07 20:15:34.453 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:34 np0005474864 nova_compute[192593]: 2025-10-07 20:15:34.887 2 DEBUG nova.compute.manager [req-dc35f3c0-6d03-4e53-ab40-d0013dbb181f req-0c8b929c-9dd3-4bc4-a4d3-6ff769b8753d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received event network-changed-c1d00195-4d32-45ac-b745-1a913060f39d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:15:34 np0005474864 nova_compute[192593]: 2025-10-07 20:15:34.888 2 DEBUG nova.compute.manager [req-dc35f3c0-6d03-4e53-ab40-d0013dbb181f req-0c8b929c-9dd3-4bc4-a4d3-6ff769b8753d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Refreshing instance network info cache due to event network-changed-c1d00195-4d32-45ac-b745-1a913060f39d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:15:34 np0005474864 nova_compute[192593]: 2025-10-07 20:15:34.888 2 DEBUG oslo_concurrency.lockutils [req-dc35f3c0-6d03-4e53-ab40-d0013dbb181f req-0c8b929c-9dd3-4bc4-a4d3-6ff769b8753d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-c491b943-fbbd-46e0-be8c-74a8c1378ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:15:34 np0005474864 nova_compute[192593]: 2025-10-07 20:15:34.889 2 DEBUG oslo_concurrency.lockutils [req-dc35f3c0-6d03-4e53-ab40-d0013dbb181f req-0c8b929c-9dd3-4bc4-a4d3-6ff769b8753d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-c491b943-fbbd-46e0-be8c-74a8c1378ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:15:34 np0005474864 nova_compute[192593]: 2025-10-07 20:15:34.889 2 DEBUG nova.network.neutron [req-dc35f3c0-6d03-4e53-ab40-d0013dbb181f req-0c8b929c-9dd3-4bc4-a4d3-6ff769b8753d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Refreshing network info cache for port c1d00195-4d32-45ac-b745-1a913060f39d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:15:36 np0005474864 podman[224426]: 2025-10-07 20:15:36.408854541 +0000 UTC m=+0.090396268 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  7 16:15:36 np0005474864 podman[224428]: 2025-10-07 20:15:36.420910516 +0000 UTC m=+0.091297973 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:15:36 np0005474864 podman[224427]: 2025-10-07 20:15:36.463597187 +0000 UTC m=+0.136226859 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  7 16:15:36 np0005474864 nova_compute[192593]: 2025-10-07 20:15:36.516 2 DEBUG nova.network.neutron [req-dc35f3c0-6d03-4e53-ab40-d0013dbb181f req-0c8b929c-9dd3-4bc4-a4d3-6ff769b8753d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Updated VIF entry in instance network info cache for port c1d00195-4d32-45ac-b745-1a913060f39d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:15:36 np0005474864 nova_compute[192593]: 2025-10-07 20:15:36.517 2 DEBUG nova.network.neutron [req-dc35f3c0-6d03-4e53-ab40-d0013dbb181f req-0c8b929c-9dd3-4bc4-a4d3-6ff769b8753d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Updating instance_info_cache with network_info: [{"id": "c1d00195-4d32-45ac-b745-1a913060f39d", "address": "fa:16:3e:e1:c8:d3", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d00195-4d", "ovs_interfaceid": "c1d00195-4d32-45ac-b745-1a913060f39d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "address": "fa:16:3e:24:aa:7e", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8c9e14-7c", "ovs_interfaceid": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:15:36 np0005474864 nova_compute[192593]: 2025-10-07 20:15:36.664 2 DEBUG oslo_concurrency.lockutils [req-dc35f3c0-6d03-4e53-ab40-d0013dbb181f req-0c8b929c-9dd3-4bc4-a4d3-6ff769b8753d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-c491b943-fbbd-46e0-be8c-74a8c1378ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:15:37 np0005474864 nova_compute[192593]: 2025-10-07 20:15:37.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:38 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:38.031 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:11:b6:fe'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '', 'neutron:device_id': 'ovnmeta-4cce2687-5124-4981-a1c7-e08b7f3b1883', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4cce2687-5124-4981-a1c7-e08b7f3b1883', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '57491b24c6b2419c842483a87c8b4d42', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=cfa24498-27c9-418c-8273-72b867a95d39, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=55d12e9a-2457-4743-aa5f-f4bdfdbc1b45) old=Port_Binding(mac=['fa:16:3e:11:b6:fe 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-4cce2687-5124-4981-a1c7-e08b7f3b1883', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-4cce2687-5124-4981-a1c7-e08b7f3b1883', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '57491b24c6b2419c842483a87c8b4d42', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:15:38 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:38.032 103685 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 55d12e9a-2457-4743-aa5f-f4bdfdbc1b45 in datapath 4cce2687-5124-4981-a1c7-e08b7f3b1883 updated#033[00m
Oct  7 16:15:38 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:38.033 103685 DEBUG neutron.agent.ovn.metadata.agent [-] There is no metadata port for network 4cce2687-5124-4981-a1c7-e08b7f3b1883 or it has no MAC or IP addresses configured, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:599#033[00m
Oct  7 16:15:38 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:38.034 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[dfd1fddf-7139-4e1d-a05e-b458647a5f53]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:15:38 np0005474864 nova_compute[192593]: 2025-10-07 20:15:38.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:38 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:38Z|00140|binding|INFO|Releasing lport 3fba40e3-39d9-4871-b9b9-3e2e5088af4f from this chassis (sb_readonly=0)
Oct  7 16:15:38 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:38Z|00141|binding|INFO|Releasing lport 5cf38e83-5f07-4562-b663-4850a1d35f81 from this chassis (sb_readonly=0)
Oct  7 16:15:38 np0005474864 nova_compute[192593]: 2025-10-07 20:15:38.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:39 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:39Z|00142|binding|INFO|Releasing lport 3fba40e3-39d9-4871-b9b9-3e2e5088af4f from this chassis (sb_readonly=0)
Oct  7 16:15:39 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:39Z|00143|binding|INFO|Releasing lport 5cf38e83-5f07-4562-b663-4850a1d35f81 from this chassis (sb_readonly=0)
Oct  7 16:15:39 np0005474864 nova_compute[192593]: 2025-10-07 20:15:39.160 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:39 np0005474864 nova_compute[192593]: 2025-10-07 20:15:39.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:41 np0005474864 nova_compute[192593]: 2025-10-07 20:15:41.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:42 np0005474864 nova_compute[192593]: 2025-10-07 20:15:42.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:42 np0005474864 podman[224504]: 2025-10-07 20:15:42.389591269 +0000 UTC m=+0.075124820 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  7 16:15:42 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:42Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e1:c8:d3 10.100.0.6
Oct  7 16:15:42 np0005474864 ovn_controller[94801]: 2025-10-07T20:15:42Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e1:c8:d3 10.100.0.6
Oct  7 16:15:43 np0005474864 nova_compute[192593]: 2025-10-07 20:15:43.202 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:45 np0005474864 podman[224524]: 2025-10-07 20:15:45.382506757 +0000 UTC m=+0.070937921 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  7 16:15:46 np0005474864 nova_compute[192593]: 2025-10-07 20:15:46.933 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:47 np0005474864 nova_compute[192593]: 2025-10-07 20:15:47.318 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:48 np0005474864 nova_compute[192593]: 2025-10-07 20:15:48.242 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:48 np0005474864 podman[224548]: 2025-10-07 20:15:48.389637741 +0000 UTC m=+0.088548085 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, config_id=edpm, org.label-schema.license=GPLv2)
Oct  7 16:15:52 np0005474864 nova_compute[192593]: 2025-10-07 20:15:52.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:52.330 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:76:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'ba:bb:91:e8:2b:5d'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:15:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:52.331 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  7 16:15:52 np0005474864 nova_compute[192593]: 2025-10-07 20:15:52.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:53 np0005474864 nova_compute[192593]: 2025-10-07 20:15:53.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:15:55.333 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:15:57 np0005474864 nova_compute[192593]: 2025-10-07 20:15:57.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:15:58 np0005474864 nova_compute[192593]: 2025-10-07 20:15:58.357 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:00 np0005474864 nova_compute[192593]: 2025-10-07 20:16:00.752 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:00 np0005474864 nova_compute[192593]: 2025-10-07 20:16:00.753 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:00 np0005474864 nova_compute[192593]: 2025-10-07 20:16:00.771 2 DEBUG nova.compute.manager [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  7 16:16:00 np0005474864 nova_compute[192593]: 2025-10-07 20:16:00.855 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:00 np0005474864 nova_compute[192593]: 2025-10-07 20:16:00.856 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:00 np0005474864 nova_compute[192593]: 2025-10-07 20:16:00.871 2 DEBUG nova.virt.hardware [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  7 16:16:00 np0005474864 nova_compute[192593]: 2025-10-07 20:16:00.871 2 INFO nova.compute.claims [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.014 2 DEBUG nova.compute.provider_tree [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.038 2 DEBUG nova.scheduler.client.report [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.065 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.066 2 DEBUG nova.compute.manager [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.129 2 DEBUG nova.compute.manager [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.130 2 DEBUG nova.network.neutron [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.153 2 INFO nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.182 2 DEBUG nova.compute.manager [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.284 2 DEBUG nova.compute.manager [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.286 2 DEBUG nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.287 2 INFO nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Creating image(s)#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.288 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "/var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.288 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "/var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.290 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "/var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.315 2 DEBUG oslo_concurrency.processutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.374 2 DEBUG nova.policy [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.420 2 DEBUG oslo_concurrency.processutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.105s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.421 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.422 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:01 np0005474864 podman[224569]: 2025-10-07 20:16:01.42519153 +0000 UTC m=+0.099450017 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., architecture=x86_64, vcs-type=git, version=9.6, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.buildah.version=1.33.7)
Oct  7 16:16:01 np0005474864 podman[224568]: 2025-10-07 20:16:01.429418591 +0000 UTC m=+0.111336027 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.444 2 DEBUG oslo_concurrency.processutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.536 2 DEBUG oslo_concurrency.processutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.539 2 DEBUG oslo_concurrency.processutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.578 2 DEBUG oslo_concurrency.processutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk 1073741824" returned: 0 in 0.039s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
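The two lines above show Nova creating the instance disk as a qcow2 overlay on top of the cached raw base image, so the per-instance file only stores blocks that diverge from the shared base. A sketch of how that command line can be assembled (the paths and size are the ones from the log; the helper name is illustrative, and actually executing the command requires `qemu-img`):

```python
def qcow2_overlay_cmd(base, target, size_bytes):
    """Build a qemu-img invocation creating a qcow2 overlay backed by `base`."""
    return [
        "env", "LC_ALL=C", "LANG=C",
        "qemu-img", "create",
        "-f", "qcow2",
        # the overlay records only deltas; reads fall through to the raw base
        "-o", f"backing_file={base},backing_fmt=raw",
        target,
        str(size_bytes),
    ]

cmd = qcow2_overlay_cmd(
    "/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b",
    "/var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk",
    1073741824,
)
```

Pinning `backing_fmt` explicitly, as the log shows, avoids qemu probing the backing file's format, which newer qemu versions refuse for security reasons.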
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.580 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.582 2 DEBUG oslo_concurrency.processutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.644 2 DEBUG oslo_concurrency.processutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
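Every `qemu-img info` call above is wrapped in `oslo_concurrency.prlimit --as=1073741824 --cpu=30`, which caps the child's address space (1 GiB) and CPU time (30 s) so a malformed or adversarial image cannot hang or balloon the compute service. The same effect can be sketched with the stdlib, assuming a Linux host; `/bin/echo` stands in for qemu-img here:

```python
import resource
import subprocess

def run_limited(cmd, as_bytes=1 << 30, cpu_secs=30):
    """Run cmd with RLIMIT_AS and RLIMIT_CPU applied in the child only."""
    def set_limits():
        # runs after fork(), before exec(): limits affect the child alone
        resource.setrlimit(resource.RLIMIT_AS, (as_bytes, as_bytes))
        resource.setrlimit(resource.RLIMIT_CPU, (cpu_secs, cpu_secs))
    return subprocess.run(cmd, preexec_fn=set_limits,
                          capture_output=True, text=True)

res = run_limited(["/bin/echo", "ok"])
```

oslo uses a separate `prlimit` wrapper process rather than `preexec_fn` partly because `preexec_fn` is unsafe in multithreaded programs; the sketch above shows only the resource-limit idea.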
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.646 2 DEBUG nova.virt.disk.api [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Checking if we can resize image /var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.647 2 DEBUG oslo_concurrency.processutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.707 2 DEBUG oslo_concurrency.processutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.709 2 DEBUG nova.virt.disk.api [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Cannot resize image /var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
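"Cannot resize image ... to a smaller size" is `nova.virt.disk.api.can_resize_image` declining a shrink: truncating a disk below its current virtual size could cut off the guest filesystem, so only grow operations proceed. The check amounts to comparing the requested size against the `virtual-size` that `qemu-img info --output=json` reports (a simplified sketch, not the exact Nova code):

```python
import json

def can_resize(qemu_img_info_json, requested_bytes):
    """Allow a resize only when it does not shrink the current virtual size."""
    virtual_size = json.loads(qemu_img_info_json)["virtual-size"]
    return requested_bytes >= virtual_size

# minimal stand-in for qemu-img info --output=json output
info = '{"virtual-size": 1073741824, "format": "qcow2"}'
```

When the check fails, Nova simply skips the resize step and boots the instance with the disk at its existing size, as the log shows here.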
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.710 2 DEBUG nova.objects.instance [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'migration_context' on Instance uuid d37e7dbd-01fd-4484-9bdb-f09d24420fa7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.730 2 DEBUG nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.730 2 DEBUG nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Ensure instance console log exists: /var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.731 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.732 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:01 np0005474864 nova_compute[192593]: 2025-10-07 20:16:01.732 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:02 np0005474864 nova_compute[192593]: 2025-10-07 20:16:02.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:03 np0005474864 nova_compute[192593]: 2025-10-07 20:16:03.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:03 np0005474864 nova_compute[192593]: 2025-10-07 20:16:03.436 2 DEBUG nova.network.neutron [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Successfully created port: b88a1e01-9e7f-49da-9f11-434972486fb7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  7 16:16:05 np0005474864 nova_compute[192593]: 2025-10-07 20:16:05.729 2 DEBUG nova.network.neutron [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Successfully created port: 9315ca92-32fd-4407-a73c-c7c9440c29b8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  7 16:16:07 np0005474864 nova_compute[192593]: 2025-10-07 20:16:07.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:07 np0005474864 podman[224625]: 2025-10-07 20:16:07.436354599 +0000 UTC m=+0.121272181 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  7 16:16:07 np0005474864 podman[224627]: 2025-10-07 20:16:07.43953255 +0000 UTC m=+0.109338009 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=multipathd, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:16:07 np0005474864 podman[224626]: 2025-10-07 20:16:07.471083413 +0000 UTC m=+0.145093473 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:16:08 np0005474864 nova_compute[192593]: 2025-10-07 20:16:08.192 2 DEBUG nova.network.neutron [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Successfully updated port: b88a1e01-9e7f-49da-9f11-434972486fb7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:16:08 np0005474864 nova_compute[192593]: 2025-10-07 20:16:08.256 2 DEBUG nova.compute.manager [req-aa0b6a7c-11e8-4af4-b992-8a0d717f1fb6 req-dbe55a6c-77d0-484b-983f-e3c5ddef9e0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received event network-changed-b88a1e01-9e7f-49da-9f11-434972486fb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:16:08 np0005474864 nova_compute[192593]: 2025-10-07 20:16:08.256 2 DEBUG nova.compute.manager [req-aa0b6a7c-11e8-4af4-b992-8a0d717f1fb6 req-dbe55a6c-77d0-484b-983f-e3c5ddef9e0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Refreshing instance network info cache due to event network-changed-b88a1e01-9e7f-49da-9f11-434972486fb7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:16:08 np0005474864 nova_compute[192593]: 2025-10-07 20:16:08.257 2 DEBUG oslo_concurrency.lockutils [req-aa0b6a7c-11e8-4af4-b992-8a0d717f1fb6 req-dbe55a6c-77d0-484b-983f-e3c5ddef9e0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-d37e7dbd-01fd-4484-9bdb-f09d24420fa7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:16:08 np0005474864 nova_compute[192593]: 2025-10-07 20:16:08.257 2 DEBUG oslo_concurrency.lockutils [req-aa0b6a7c-11e8-4af4-b992-8a0d717f1fb6 req-dbe55a6c-77d0-484b-983f-e3c5ddef9e0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-d37e7dbd-01fd-4484-9bdb-f09d24420fa7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:16:08 np0005474864 nova_compute[192593]: 2025-10-07 20:16:08.257 2 DEBUG nova.network.neutron [req-aa0b6a7c-11e8-4af4-b992-8a0d717f1fb6 req-dbe55a6c-77d0-484b-983f-e3c5ddef9e0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Refreshing network info cache for port b88a1e01-9e7f-49da-9f11-434972486fb7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:16:08 np0005474864 nova_compute[192593]: 2025-10-07 20:16:08.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:08 np0005474864 nova_compute[192593]: 2025-10-07 20:16:08.477 2 DEBUG nova.network.neutron [req-aa0b6a7c-11e8-4af4-b992-8a0d717f1fb6 req-dbe55a6c-77d0-484b-983f-e3c5ddef9e0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:16:09 np0005474864 nova_compute[192593]: 2025-10-07 20:16:09.434 2 DEBUG nova.network.neutron [req-aa0b6a7c-11e8-4af4-b992-8a0d717f1fb6 req-dbe55a6c-77d0-484b-983f-e3c5ddef9e0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:16:09 np0005474864 nova_compute[192593]: 2025-10-07 20:16:09.460 2 DEBUG oslo_concurrency.lockutils [req-aa0b6a7c-11e8-4af4-b992-8a0d717f1fb6 req-dbe55a6c-77d0-484b-983f-e3c5ddef9e0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-d37e7dbd-01fd-4484-9bdb-f09d24420fa7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:16:09 np0005474864 nova_compute[192593]: 2025-10-07 20:16:09.522 2 DEBUG nova.network.neutron [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Successfully updated port: 9315ca92-32fd-4407-a73c-c7c9440c29b8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:16:09 np0005474864 nova_compute[192593]: 2025-10-07 20:16:09.539 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "refresh_cache-d37e7dbd-01fd-4484-9bdb-f09d24420fa7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:16:09 np0005474864 nova_compute[192593]: 2025-10-07 20:16:09.540 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquired lock "refresh_cache-d37e7dbd-01fd-4484-9bdb-f09d24420fa7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:16:09 np0005474864 nova_compute[192593]: 2025-10-07 20:16:09.540 2 DEBUG nova.network.neutron [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:16:09 np0005474864 nova_compute[192593]: 2025-10-07 20:16:09.785 2 DEBUG nova.network.neutron [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:16:10 np0005474864 nova_compute[192593]: 2025-10-07 20:16:10.415 2 DEBUG nova.compute.manager [req-7cb47b4e-e9b2-4dc5-8059-e4fc60a74009 req-2907f7fa-e1a9-489e-99d2-99d2fd48df50 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received event network-changed-9315ca92-32fd-4407-a73c-c7c9440c29b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:16:10 np0005474864 nova_compute[192593]: 2025-10-07 20:16:10.416 2 DEBUG nova.compute.manager [req-7cb47b4e-e9b2-4dc5-8059-e4fc60a74009 req-2907f7fa-e1a9-489e-99d2-99d2fd48df50 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Refreshing instance network info cache due to event network-changed-9315ca92-32fd-4407-a73c-c7c9440c29b8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:16:10 np0005474864 nova_compute[192593]: 2025-10-07 20:16:10.416 2 DEBUG oslo_concurrency.lockutils [req-7cb47b4e-e9b2-4dc5-8059-e4fc60a74009 req-2907f7fa-e1a9-489e-99d2-99d2fd48df50 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-d37e7dbd-01fd-4484-9bdb-f09d24420fa7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:16:12 np0005474864 nova_compute[192593]: 2025-10-07 20:16:12.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.378 2 DEBUG nova.network.neutron [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Updating instance_info_cache with network_info: [{"id": "b88a1e01-9e7f-49da-9f11-434972486fb7", "address": "fa:16:3e:a0:2b:b7", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb88a1e01-9e", "ovs_interfaceid": "b88a1e01-9e7f-49da-9f11-434972486fb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "address": "fa:16:3e:72:12:14", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9315ca92-32", "ovs_interfaceid": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
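The `network_info` blob spanning the two lines above is the serialized cache entry Nova keeps per instance: one element per port (VIF), each carrying its MAC, OVS/OVN binding details, and per-subnet fixed IPs. Extracting the addresses is plain JSON traversal; the sketch below works over a trimmed sample of the logged structure:

```python
import json

# trimmed excerpt of the network_info cache from the log above
sample = json.loads("""
[{"id": "b88a1e01-9e7f-49da-9f11-434972486fb7",
  "address": "fa:16:3e:a0:2b:b7",
  "network": {"subnets": [
      {"cidr": "10.100.0.0/28",
       "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4}]}]}},
 {"id": "9315ca92-32fd-4407-a73c-c7c9440c29b8",
  "address": "fa:16:3e:72:12:14",
  "network": {"subnets": [
      {"cidr": "2001:db8:0:1::/64",
       "ips": [{"address": "2001:db8:0:1:f816:3eff:fe72:1214",
                "type": "fixed", "version": 6}]}]}}]
""")

def fixed_ips(network_info):
    """Map each port id to the list of its fixed IP addresses."""
    return {
        vif["id"]: [ip["address"]
                    for subnet in vif["network"]["subnets"]
                    for ip in subnet["ips"]
                    if ip["type"] == "fixed"]
        for vif in network_info
    }
```

Note both ports are still `"active": false` in the log at this point; they flip to active once OVN reports the bindings up and Neutron emits the `network-vif-plugged` events Nova waits on.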
Oct  7 16:16:13 np0005474864 podman[224690]: 2025-10-07 20:16:13.385975039 +0000 UTC m=+0.077089167 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.404 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.410 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Releasing lock "refresh_cache-d37e7dbd-01fd-4484-9bdb-f09d24420fa7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.411 2 DEBUG nova.compute.manager [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Instance network_info: |[{"id": "b88a1e01-9e7f-49da-9f11-434972486fb7", "address": "fa:16:3e:a0:2b:b7", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb88a1e01-9e", "ovs_interfaceid": "b88a1e01-9e7f-49da-9f11-434972486fb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "address": "fa:16:3e:72:12:14", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9315ca92-32", "ovs_interfaceid": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.411 2 DEBUG oslo_concurrency.lockutils [req-7cb47b4e-e9b2-4dc5-8059-e4fc60a74009 req-2907f7fa-e1a9-489e-99d2-99d2fd48df50 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-d37e7dbd-01fd-4484-9bdb-f09d24420fa7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.412 2 DEBUG nova.network.neutron [req-7cb47b4e-e9b2-4dc5-8059-e4fc60a74009 req-2907f7fa-e1a9-489e-99d2-99d2fd48df50 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Refreshing network info cache for port 9315ca92-32fd-4407-a73c-c7c9440c29b8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.417 2 DEBUG nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Start _get_guest_xml network_info=[{"id": "b88a1e01-9e7f-49da-9f11-434972486fb7", "address": "fa:16:3e:a0:2b:b7", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb88a1e01-9e", "ovs_interfaceid": "b88a1e01-9e7f-49da-9f11-434972486fb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "address": "fa:16:3e:72:12:14", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9315ca92-32", "ovs_interfaceid": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.422 2 WARNING nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.428 2 DEBUG nova.virt.libvirt.host [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.428 2 DEBUG nova.virt.libvirt.host [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.433 2 DEBUG nova.virt.libvirt.host [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.433 2 DEBUG nova.virt.libvirt.host [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.434 2 DEBUG nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.434 2 DEBUG nova.virt.hardware [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T20:09:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3fec056a-1226-48ad-a02c-e4fe097a9363',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.435 2 DEBUG nova.virt.hardware [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.435 2 DEBUG nova.virt.hardware [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.435 2 DEBUG nova.virt.hardware [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.435 2 DEBUG nova.virt.hardware [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.435 2 DEBUG nova.virt.hardware [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.435 2 DEBUG nova.virt.hardware [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.436 2 DEBUG nova.virt.hardware [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.436 2 DEBUG nova.virt.hardware [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.436 2 DEBUG nova.virt.hardware [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.436 2 DEBUG nova.virt.hardware [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.440 2 DEBUG nova.virt.libvirt.vif [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:15:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1118891772',display_name='tempest-TestGettingAddress-server-1118891772',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1118891772',id=28,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMHh+5BPf+7kY8pKVSG/i4hyBxtn8tVeo7dRQWQLUq5e94pXRwlhKtKtONimxLXuRPnL+V+cWD3ZzAcmW1n0ScN01FDbsZi+mcaAgIQIdbWXoCLml/sEmzbizmftxgLGkQ==',key_name='tempest-TestGettingAddress-195199531',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-pk9arxda',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:16:01Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=d37e7dbd-01fd-4484-9bdb-f09d24420fa7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b88a1e01-9e7f-49da-9f11-434972486fb7", "address": "fa:16:3e:a0:2b:b7", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb88a1e01-9e", "ovs_interfaceid": "b88a1e01-9e7f-49da-9f11-434972486fb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.441 2 DEBUG nova.network.os_vif_util [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "b88a1e01-9e7f-49da-9f11-434972486fb7", "address": "fa:16:3e:a0:2b:b7", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb88a1e01-9e", "ovs_interfaceid": "b88a1e01-9e7f-49da-9f11-434972486fb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.441 2 DEBUG nova.network.os_vif_util [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:2b:b7,bridge_name='br-int',has_traffic_filtering=True,id=b88a1e01-9e7f-49da-9f11-434972486fb7,network=Network(d6a1d6d4-586d-450e-8b73-6ad134098649),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb88a1e01-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.442 2 DEBUG nova.virt.libvirt.vif [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:15:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1118891772',display_name='tempest-TestGettingAddress-server-1118891772',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1118891772',id=28,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMHh+5BPf+7kY8pKVSG/i4hyBxtn8tVeo7dRQWQLUq5e94pXRwlhKtKtONimxLXuRPnL+V+cWD3ZzAcmW1n0ScN01FDbsZi+mcaAgIQIdbWXoCLml/sEmzbizmftxgLGkQ==',key_name='tempest-TestGettingAddress-195199531',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-pk9arxda',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:16:01Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=d37e7dbd-01fd-4484-9bdb-f09d24420fa7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "address": "fa:16:3e:72:12:14", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9315ca92-32", "ovs_interfaceid": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.442 2 DEBUG nova.network.os_vif_util [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "address": "fa:16:3e:72:12:14", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9315ca92-32", "ovs_interfaceid": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.443 2 DEBUG nova.network.os_vif_util [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:12:14,bridge_name='br-int',has_traffic_filtering=True,id=9315ca92-32fd-4407-a73c-c7c9440c29b8,network=Network(99465e0c-6ee8-477a-94aa-ab737f76f9e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9315ca92-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.444 2 DEBUG nova.objects.instance [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'pci_devices' on Instance uuid d37e7dbd-01fd-4484-9bdb-f09d24420fa7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.462 2 DEBUG nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] End _get_guest_xml xml=<domain type="kvm">
Oct  7 16:16:13 np0005474864 nova_compute[192593]:  <uuid>d37e7dbd-01fd-4484-9bdb-f09d24420fa7</uuid>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:  <name>instance-0000001c</name>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:  <memory>131072</memory>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:  <vcpu>1</vcpu>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <nova:name>tempest-TestGettingAddress-server-1118891772</nova:name>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <nova:creationTime>2025-10-07 20:16:13</nova:creationTime>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <nova:flavor name="m1.nano">
Oct  7 16:16:13 np0005474864 nova_compute[192593]:        <nova:memory>128</nova:memory>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:        <nova:disk>1</nova:disk>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:        <nova:swap>0</nova:swap>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:        <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:        <nova:vcpus>1</nova:vcpus>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      </nova:flavor>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <nova:owner>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:        <nova:user uuid="334f092941fc46c496c7def76b2cfe18">tempest-TestGettingAddress-626136673-project-member</nova:user>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:        <nova:project uuid="2f9bf744045540618c9980fd4a7694f5">tempest-TestGettingAddress-626136673</nova:project>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      </nova:owner>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <nova:ports>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:        <nova:port uuid="b88a1e01-9e7f-49da-9f11-434972486fb7">
Oct  7 16:16:13 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:        <nova:port uuid="9315ca92-32fd-4407-a73c-c7c9440c29b8">
Oct  7 16:16:13 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe72:1214" ipVersion="6"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe72:1214" ipVersion="6"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      </nova:ports>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    </nova:instance>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:  <sysinfo type="smbios">
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <entry name="manufacturer">RDO</entry>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <entry name="product">OpenStack Compute</entry>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <entry name="serial">d37e7dbd-01fd-4484-9bdb-f09d24420fa7</entry>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <entry name="uuid">d37e7dbd-01fd-4484-9bdb-f09d24420fa7</entry>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <entry name="family">Virtual Machine</entry>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <boot dev="hd"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <smbios mode="sysinfo"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <vmcoreinfo/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:  <clock offset="utc">
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <timer name="pit" tickpolicy="delay"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <timer name="hpet" present="no"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:  <cpu mode="custom" match="exact">
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <model>Nehalem</model>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <topology sockets="1" cores="1" threads="1"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <disk type="file" device="disk">
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <target dev="vda" bus="virtio"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <disk type="file" device="cdrom">
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <driver name="qemu" type="raw" cache="none"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.config"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <target dev="sda" bus="sata"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:a0:2b:b7"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <target dev="tapb88a1e01-9e"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:72:12:14"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <target dev="tap9315ca92-32"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <serial type="pty">
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <log file="/var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/console.log" append="off"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <input type="tablet" bus="usb"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <rng model="virtio">
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <backend model="random">/dev/urandom</backend>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <controller type="usb" index="0"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    <memballoon model="virtio">
Oct  7 16:16:13 np0005474864 nova_compute[192593]:      <stats period="10"/>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:16:13 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:16:13 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:16:13 np0005474864 nova_compute[192593]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.462 2 DEBUG nova.compute.manager [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Preparing to wait for external event network-vif-plugged-b88a1e01-9e7f-49da-9f11-434972486fb7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.462 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.463 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.463 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.463 2 DEBUG nova.compute.manager [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Preparing to wait for external event network-vif-plugged-9315ca92-32fd-4407-a73c-c7c9440c29b8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.463 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.463 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.463 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.464 2 DEBUG nova.virt.libvirt.vif [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:15:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1118891772',display_name='tempest-TestGettingAddress-server-1118891772',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1118891772',id=28,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMHh+5BPf+7kY8pKVSG/i4hyBxtn8tVeo7dRQWQLUq5e94pXRwlhKtKtONimxLXuRPnL+V+cWD3ZzAcmW1n0ScN01FDbsZi+mcaAgIQIdbWXoCLml/sEmzbizmftxgLGkQ==',key_name='tempest-TestGettingAddress-195199531',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-pk9arxda',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:16:01Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=d37e7dbd-01fd-4484-9bdb-f09d24420fa7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "b88a1e01-9e7f-49da-9f11-434972486fb7", "address": "fa:16:3e:a0:2b:b7", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb88a1e01-9e", "ovs_interfaceid": "b88a1e01-9e7f-49da-9f11-434972486fb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.464 2 DEBUG nova.network.os_vif_util [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "b88a1e01-9e7f-49da-9f11-434972486fb7", "address": "fa:16:3e:a0:2b:b7", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb88a1e01-9e", "ovs_interfaceid": "b88a1e01-9e7f-49da-9f11-434972486fb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.465 2 DEBUG nova.network.os_vif_util [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:a0:2b:b7,bridge_name='br-int',has_traffic_filtering=True,id=b88a1e01-9e7f-49da-9f11-434972486fb7,network=Network(d6a1d6d4-586d-450e-8b73-6ad134098649),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb88a1e01-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.465 2 DEBUG os_vif [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:2b:b7,bridge_name='br-int',has_traffic_filtering=True,id=b88a1e01-9e7f-49da-9f11-434972486fb7,network=Network(d6a1d6d4-586d-450e-8b73-6ad134098649),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb88a1e01-9e') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.466 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.466 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.471 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.472 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb88a1e01-9e, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.472 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapb88a1e01-9e, col_values=(('external_ids', {'iface-id': 'b88a1e01-9e7f-49da-9f11-434972486fb7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:a0:2b:b7', 'vm-uuid': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:13 np0005474864 NetworkManager[51631]: <info>  [1759868173.4790] manager: (tapb88a1e01-9e): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/79)
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.479 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.487 2 INFO os_vif [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:a0:2b:b7,bridge_name='br-int',has_traffic_filtering=True,id=b88a1e01-9e7f-49da-9f11-434972486fb7,network=Network(d6a1d6d4-586d-450e-8b73-6ad134098649),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb88a1e01-9e')#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.489 2 DEBUG nova.virt.libvirt.vif [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:15:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1118891772',display_name='tempest-TestGettingAddress-server-1118891772',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1118891772',id=28,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMHh+5BPf+7kY8pKVSG/i4hyBxtn8tVeo7dRQWQLUq5e94pXRwlhKtKtONimxLXuRPnL+V+cWD3ZzAcmW1n0ScN01FDbsZi+mcaAgIQIdbWXoCLml/sEmzbizmftxgLGkQ==',key_name='tempest-TestGettingAddress-195199531',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-pk9arxda',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:16:01Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=d37e7dbd-01fd-4484-9bdb-f09d24420fa7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "address": "fa:16:3e:72:12:14", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9315ca92-32", "ovs_interfaceid": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.489 2 DEBUG nova.network.os_vif_util [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "address": "fa:16:3e:72:12:14", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9315ca92-32", "ovs_interfaceid": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.491 2 DEBUG nova.network.os_vif_util [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:12:14,bridge_name='br-int',has_traffic_filtering=True,id=9315ca92-32fd-4407-a73c-c7c9440c29b8,network=Network(99465e0c-6ee8-477a-94aa-ab737f76f9e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9315ca92-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.492 2 DEBUG os_vif [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:12:14,bridge_name='br-int',has_traffic_filtering=True,id=9315ca92-32fd-4407-a73c-c7c9440c29b8,network=Network(99465e0c-6ee8-477a-94aa-ab737f76f9e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9315ca92-32') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.492 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.493 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.493 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.497 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9315ca92-32, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.497 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9315ca92-32, col_values=(('external_ids', {'iface-id': '9315ca92-32fd-4407-a73c-c7c9440c29b8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:72:12:14', 'vm-uuid': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:16:13 np0005474864 NetworkManager[51631]: <info>  [1759868173.5017] manager: (tap9315ca92-32): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.510 2 INFO os_vif [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:12:14,bridge_name='br-int',has_traffic_filtering=True,id=9315ca92-32fd-4407-a73c-c7c9440c29b8,network=Network(99465e0c-6ee8-477a-94aa-ab737f76f9e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9315ca92-32')#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.585 2 DEBUG nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.585 2 DEBUG nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.586 2 DEBUG nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No VIF found with MAC fa:16:3e:a0:2b:b7, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.586 2 DEBUG nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No VIF found with MAC fa:16:3e:72:12:14, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:16:13 np0005474864 nova_compute[192593]: 2025-10-07 20:16:13.586 2 INFO nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Using config drive#033[00m
Oct  7 16:16:14 np0005474864 nova_compute[192593]: 2025-10-07 20:16:14.316 2 INFO nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Creating config drive at /var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.config#033[00m
Oct  7 16:16:14 np0005474864 nova_compute[192593]: 2025-10-07 20:16:14.322 2 DEBUG oslo_concurrency.processutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf5uyt0xg execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:16:14 np0005474864 nova_compute[192593]: 2025-10-07 20:16:14.454 2 DEBUG oslo_concurrency.processutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf5uyt0xg" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:16:14 np0005474864 kernel: tapb88a1e01-9e: entered promiscuous mode
Oct  7 16:16:14 np0005474864 NetworkManager[51631]: <info>  [1759868174.5130] manager: (tapb88a1e01-9e): new Tun device (/org/freedesktop/NetworkManager/Devices/81)
Oct  7 16:16:14 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:14Z|00144|binding|INFO|Claiming lport b88a1e01-9e7f-49da-9f11-434972486fb7 for this chassis.
Oct  7 16:16:14 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:14Z|00145|binding|INFO|b88a1e01-9e7f-49da-9f11-434972486fb7: Claiming fa:16:3e:a0:2b:b7 10.100.0.9
Oct  7 16:16:14 np0005474864 nova_compute[192593]: 2025-10-07 20:16:14.519 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.533 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:2b:b7 10.100.0.9'], port_security=['fa:16:3e:a0:2b:b7 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6a1d6d4-586d-450e-8b73-6ad134098649', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a6b53ec8-0088-49b2-96e7-c4770f1b7fbc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34f1dcb0-f04e-41a8-8b02-05684b457dc5, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=b88a1e01-9e7f-49da-9f11-434972486fb7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.535 103685 INFO neutron.agent.ovn.metadata.agent [-] Port b88a1e01-9e7f-49da-9f11-434972486fb7 in datapath d6a1d6d4-586d-450e-8b73-6ad134098649 bound to our chassis#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.538 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d6a1d6d4-586d-450e-8b73-6ad134098649#033[00m
Oct  7 16:16:14 np0005474864 NetworkManager[51631]: <info>  [1759868174.5402] manager: (tap9315ca92-32): new Tun device (/org/freedesktop/NetworkManager/Devices/82)
Oct  7 16:16:14 np0005474864 kernel: tap9315ca92-32: entered promiscuous mode
Oct  7 16:16:14 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:14Z|00146|binding|INFO|Setting lport b88a1e01-9e7f-49da-9f11-434972486fb7 ovn-installed in OVS
Oct  7 16:16:14 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:14Z|00147|binding|INFO|Setting lport b88a1e01-9e7f-49da-9f11-434972486fb7 up in Southbound
Oct  7 16:16:14 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:14Z|00148|if_status|INFO|Not updating pb chassis for 9315ca92-32fd-4407-a73c-c7c9440c29b8 now as sb is readonly
Oct  7 16:16:14 np0005474864 nova_compute[192593]: 2025-10-07 20:16:14.548 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:14 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:14Z|00149|binding|INFO|Claiming lport 9315ca92-32fd-4407-a73c-c7c9440c29b8 for this chassis.
Oct  7 16:16:14 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:14Z|00150|binding|INFO|9315ca92-32fd-4407-a73c-c7c9440c29b8: Claiming fa:16:3e:72:12:14 2001:db8:0:1:f816:3eff:fe72:1214 2001:db8::f816:3eff:fe72:1214
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.562 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[de47f349-68c8-4e20-a1ee-10102be95368]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:14 np0005474864 systemd-udevd[224737]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:16:14 np0005474864 systemd-udevd[224739]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:16:14 np0005474864 nova_compute[192593]: 2025-10-07 20:16:14.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.568 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:72:12:14 2001:db8:0:1:f816:3eff:fe72:1214 2001:db8::f816:3eff:fe72:1214'], port_security=['fa:16:3e:72:12:14 2001:db8:0:1:f816:3eff:fe72:1214 2001:db8::f816:3eff:fe72:1214'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe72:1214/64 2001:db8::f816:3eff:fe72:1214/64', 'neutron:device_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-99465e0c-6ee8-477a-94aa-ab737f76f9e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'a6b53ec8-0088-49b2-96e7-c4770f1b7fbc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7fb1d4ce-b691-4091-872a-86df16b02e47, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=9315ca92-32fd-4407-a73c-c7c9440c29b8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:16:14 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:14Z|00151|binding|INFO|Setting lport 9315ca92-32fd-4407-a73c-c7c9440c29b8 ovn-installed in OVS
Oct  7 16:16:14 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:14Z|00152|binding|INFO|Setting lport 9315ca92-32fd-4407-a73c-c7c9440c29b8 up in Southbound
Oct  7 16:16:14 np0005474864 nova_compute[192593]: 2025-10-07 20:16:14.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:14 np0005474864 systemd-machined[152586]: New machine qemu-9-instance-0000001c.
Oct  7 16:16:14 np0005474864 NetworkManager[51631]: <info>  [1759868174.5855] device (tap9315ca92-32): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:16:14 np0005474864 NetworkManager[51631]: <info>  [1759868174.5871] device (tapb88a1e01-9e): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:16:14 np0005474864 NetworkManager[51631]: <info>  [1759868174.5883] device (tap9315ca92-32): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:16:14 np0005474864 NetworkManager[51631]: <info>  [1759868174.5890] device (tapb88a1e01-9e): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:16:14 np0005474864 systemd[1]: Started Virtual Machine qemu-9-instance-0000001c.
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.608 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[ceecf156-57a8-4bf0-a438-8b5b24fc088c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.611 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[0228bc36-5dfe-40d2-ad2a-ee0dd35e56e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.652 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[04b35664-dd59-45cd-abae-393a0c98a85d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.674 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[0f54855e-eba1-450f-ba97-a150dcd09036]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6a1d6d4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:c9:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 10, 'tx_packets': 6, 'rx_bytes': 916, 'tx_bytes': 444, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376470, 'reachable_time': 23219, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224754, 'error': None, 'target': 'ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.692 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[8cfe5c5e-b137-4f83-ab0e-8f934c0769e2]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd6a1d6d4-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376488, 'tstamp': 376488}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224755, 'error': None, 'target': 'ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd6a1d6d4-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376493, 'tstamp': 376493}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224755, 'error': None, 'target': 'ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.694 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6a1d6d4-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:14 np0005474864 nova_compute[192593]: 2025-10-07 20:16:14.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:14 np0005474864 nova_compute[192593]: 2025-10-07 20:16:14.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.765 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6a1d6d4-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.766 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.767 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd6a1d6d4-50, col_values=(('external_ids', {'iface-id': '5cf38e83-5f07-4562-b663-4850a1d35f81'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.767 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.769 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 9315ca92-32fd-4407-a73c-c7c9440c29b8 in datapath 99465e0c-6ee8-477a-94aa-ab737f76f9e4 unbound from our chassis#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.771 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 99465e0c-6ee8-477a-94aa-ab737f76f9e4#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.794 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[41a178fe-6f3c-4cbb-bbeb-e7a1590f71bc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.828 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[971921b8-9901-4795-b4f5-959094f4f667]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.833 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[f14132ed-6df3-478b-922e-11859748a986]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.868 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[423ea16f-d61d-4ac3-894c-1e37b842cf20]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.898 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[819f9623-e33d-453a-93f4-29ffe218fa40]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap99465e0c-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:17:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 23, 'tx_packets': 5, 'rx_bytes': 2146, 'tx_bytes': 398, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 23, 'tx_packets': 5, 'rx_bytes': 2146, 'tx_bytes': 398, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376569, 'reachable_time': 25883, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 23, 'inoctets': 1824, 'indelivers': 7, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 23, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 1824, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 23, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 7, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224761, 'error': None, 'target': 'ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.925 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[a2a2bb48-547b-442a-af1d-e90a129980e7]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap99465e0c-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376582, 'tstamp': 376582}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224762, 'error': None, 'target': 'ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.927 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99465e0c-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:14 np0005474864 nova_compute[192593]: 2025-10-07 20:16:14.929 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:14 np0005474864 nova_compute[192593]: 2025-10-07 20:16:14.930 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.930 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap99465e0c-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.931 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.931 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap99465e0c-60, col_values=(('external_ids', {'iface-id': '3fba40e3-39d9-4871-b9b9-3e2e5088af4f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:14 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:14.931 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:16:15 np0005474864 nova_compute[192593]: 2025-10-07 20:16:15.694 2 DEBUG nova.compute.manager [req-e015320c-9207-43e7-8b22-d5dea62826b9 req-d6b01c43-9d1e-483b-9255-78d2d98da922 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received event network-vif-plugged-b88a1e01-9e7f-49da-9f11-434972486fb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:16:15 np0005474864 nova_compute[192593]: 2025-10-07 20:16:15.695 2 DEBUG oslo_concurrency.lockutils [req-e015320c-9207-43e7-8b22-d5dea62826b9 req-d6b01c43-9d1e-483b-9255-78d2d98da922 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:15 np0005474864 nova_compute[192593]: 2025-10-07 20:16:15.695 2 DEBUG oslo_concurrency.lockutils [req-e015320c-9207-43e7-8b22-d5dea62826b9 req-d6b01c43-9d1e-483b-9255-78d2d98da922 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:15 np0005474864 nova_compute[192593]: 2025-10-07 20:16:15.696 2 DEBUG oslo_concurrency.lockutils [req-e015320c-9207-43e7-8b22-d5dea62826b9 req-d6b01c43-9d1e-483b-9255-78d2d98da922 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:15 np0005474864 nova_compute[192593]: 2025-10-07 20:16:15.696 2 DEBUG nova.compute.manager [req-e015320c-9207-43e7-8b22-d5dea62826b9 req-d6b01c43-9d1e-483b-9255-78d2d98da922 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Processing event network-vif-plugged-b88a1e01-9e7f-49da-9f11-434972486fb7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:16:15 np0005474864 nova_compute[192593]: 2025-10-07 20:16:15.767 2 DEBUG nova.network.neutron [req-7cb47b4e-e9b2-4dc5-8059-e4fc60a74009 req-2907f7fa-e1a9-489e-99d2-99d2fd48df50 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Updated VIF entry in instance network info cache for port 9315ca92-32fd-4407-a73c-c7c9440c29b8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:16:15 np0005474864 nova_compute[192593]: 2025-10-07 20:16:15.768 2 DEBUG nova.network.neutron [req-7cb47b4e-e9b2-4dc5-8059-e4fc60a74009 req-2907f7fa-e1a9-489e-99d2-99d2fd48df50 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Updating instance_info_cache with network_info: [{"id": "b88a1e01-9e7f-49da-9f11-434972486fb7", "address": "fa:16:3e:a0:2b:b7", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb88a1e01-9e", "ovs_interfaceid": "b88a1e01-9e7f-49da-9f11-434972486fb7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "address": "fa:16:3e:72:12:14", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9315ca92-32", "ovs_interfaceid": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:16:15 np0005474864 nova_compute[192593]: 2025-10-07 20:16:15.785 2 DEBUG oslo_concurrency.lockutils [req-7cb47b4e-e9b2-4dc5-8059-e4fc60a74009 req-2907f7fa-e1a9-489e-99d2-99d2fd48df50 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-d37e7dbd-01fd-4484-9bdb-f09d24420fa7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:16:15 np0005474864 nova_compute[192593]: 2025-10-07 20:16:15.837 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868175.8363864, d37e7dbd-01fd-4484-9bdb-f09d24420fa7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:16:15 np0005474864 nova_compute[192593]: 2025-10-07 20:16:15.837 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] VM Started (Lifecycle Event)#033[00m
Oct  7 16:16:15 np0005474864 nova_compute[192593]: 2025-10-07 20:16:15.869 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:16:15 np0005474864 nova_compute[192593]: 2025-10-07 20:16:15.874 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868175.8369215, d37e7dbd-01fd-4484-9bdb-f09d24420fa7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:16:15 np0005474864 nova_compute[192593]: 2025-10-07 20:16:15.875 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:16:15 np0005474864 nova_compute[192593]: 2025-10-07 20:16:15.901 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:16:15 np0005474864 nova_compute[192593]: 2025-10-07 20:16:15.903 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:16:15 np0005474864 nova_compute[192593]: 2025-10-07 20:16:15.926 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:16:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:16.190 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:16.191 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:16.192 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:16 np0005474864 podman[224771]: 2025-10-07 20:16:16.420045204 +0000 UTC m=+0.100517688 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.807 2 DEBUG nova.compute.manager [req-d68bf000-2474-4f07-ab5b-1eaaa70d3059 req-401742d9-48e5-4a39-91f3-23e4a4852233 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received event network-vif-plugged-b88a1e01-9e7f-49da-9f11-434972486fb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.807 2 DEBUG oslo_concurrency.lockutils [req-d68bf000-2474-4f07-ab5b-1eaaa70d3059 req-401742d9-48e5-4a39-91f3-23e4a4852233 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.808 2 DEBUG oslo_concurrency.lockutils [req-d68bf000-2474-4f07-ab5b-1eaaa70d3059 req-401742d9-48e5-4a39-91f3-23e4a4852233 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.808 2 DEBUG oslo_concurrency.lockutils [req-d68bf000-2474-4f07-ab5b-1eaaa70d3059 req-401742d9-48e5-4a39-91f3-23e4a4852233 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.808 2 DEBUG nova.compute.manager [req-d68bf000-2474-4f07-ab5b-1eaaa70d3059 req-401742d9-48e5-4a39-91f3-23e4a4852233 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] No event matching network-vif-plugged-b88a1e01-9e7f-49da-9f11-434972486fb7 in dict_keys([('network-vif-plugged', '9315ca92-32fd-4407-a73c-c7c9440c29b8')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.809 2 WARNING nova.compute.manager [req-d68bf000-2474-4f07-ab5b-1eaaa70d3059 req-401742d9-48e5-4a39-91f3-23e4a4852233 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received unexpected event network-vif-plugged-b88a1e01-9e7f-49da-9f11-434972486fb7 for instance with vm_state building and task_state spawning.#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.809 2 DEBUG nova.compute.manager [req-d68bf000-2474-4f07-ab5b-1eaaa70d3059 req-401742d9-48e5-4a39-91f3-23e4a4852233 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received event network-vif-plugged-9315ca92-32fd-4407-a73c-c7c9440c29b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.810 2 DEBUG oslo_concurrency.lockutils [req-d68bf000-2474-4f07-ab5b-1eaaa70d3059 req-401742d9-48e5-4a39-91f3-23e4a4852233 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.810 2 DEBUG oslo_concurrency.lockutils [req-d68bf000-2474-4f07-ab5b-1eaaa70d3059 req-401742d9-48e5-4a39-91f3-23e4a4852233 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.811 2 DEBUG oslo_concurrency.lockutils [req-d68bf000-2474-4f07-ab5b-1eaaa70d3059 req-401742d9-48e5-4a39-91f3-23e4a4852233 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.811 2 DEBUG nova.compute.manager [req-d68bf000-2474-4f07-ab5b-1eaaa70d3059 req-401742d9-48e5-4a39-91f3-23e4a4852233 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Processing event network-vif-plugged-9315ca92-32fd-4407-a73c-c7c9440c29b8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.811 2 DEBUG nova.compute.manager [req-d68bf000-2474-4f07-ab5b-1eaaa70d3059 req-401742d9-48e5-4a39-91f3-23e4a4852233 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received event network-vif-plugged-9315ca92-32fd-4407-a73c-c7c9440c29b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.812 2 DEBUG oslo_concurrency.lockutils [req-d68bf000-2474-4f07-ab5b-1eaaa70d3059 req-401742d9-48e5-4a39-91f3-23e4a4852233 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.812 2 DEBUG oslo_concurrency.lockutils [req-d68bf000-2474-4f07-ab5b-1eaaa70d3059 req-401742d9-48e5-4a39-91f3-23e4a4852233 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.813 2 DEBUG oslo_concurrency.lockutils [req-d68bf000-2474-4f07-ab5b-1eaaa70d3059 req-401742d9-48e5-4a39-91f3-23e4a4852233 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.813 2 DEBUG nova.compute.manager [req-d68bf000-2474-4f07-ab5b-1eaaa70d3059 req-401742d9-48e5-4a39-91f3-23e4a4852233 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] No waiting events found dispatching network-vif-plugged-9315ca92-32fd-4407-a73c-c7c9440c29b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.813 2 WARNING nova.compute.manager [req-d68bf000-2474-4f07-ab5b-1eaaa70d3059 req-401742d9-48e5-4a39-91f3-23e4a4852233 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received unexpected event network-vif-plugged-9315ca92-32fd-4407-a73c-c7c9440c29b8 for instance with vm_state building and task_state spawning.#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.815 2 DEBUG nova.compute.manager [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.820 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868177.8189921, d37e7dbd-01fd-4484-9bdb-f09d24420fa7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.820 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.823 2 DEBUG nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.826 2 INFO nova.virt.libvirt.driver [-] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Instance spawned successfully.#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.826 2 DEBUG nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.855 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.860 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.871 2 DEBUG nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.872 2 DEBUG nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.872 2 DEBUG nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.873 2 DEBUG nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.873 2 DEBUG nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.874 2 DEBUG nova.virt.libvirt.driver [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.910 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.971 2 INFO nova.compute.manager [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Took 16.69 seconds to spawn the instance on the hypervisor.#033[00m
Oct  7 16:16:17 np0005474864 nova_compute[192593]: 2025-10-07 20:16:17.971 2 DEBUG nova.compute.manager [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:16:18 np0005474864 nova_compute[192593]: 2025-10-07 20:16:18.059 2 INFO nova.compute.manager [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Took 17.24 seconds to build instance.#033[00m
Oct  7 16:16:18 np0005474864 nova_compute[192593]: 2025-10-07 20:16:18.081 2 DEBUG oslo_concurrency.lockutils [None req-99fdf44e-d1d6-4846-ae4d-814081db3c1e 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.329s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:18 np0005474864 nova_compute[192593]: 2025-10-07 20:16:18.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:18 np0005474864 nova_compute[192593]: 2025-10-07 20:16:18.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:19 np0005474864 podman[224796]: 2025-10-07 20:16:19.421900296 +0000 UTC m=+0.110133372 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  7 16:16:21 np0005474864 nova_compute[192593]: 2025-10-07 20:16:21.405 2 DEBUG nova.compute.manager [req-79f1d283-0bf6-4830-acc6-55c1b5399940 req-3853eae8-e98c-4e01-a496-f9ea6cab9940 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received event network-changed-b88a1e01-9e7f-49da-9f11-434972486fb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:16:21 np0005474864 nova_compute[192593]: 2025-10-07 20:16:21.405 2 DEBUG nova.compute.manager [req-79f1d283-0bf6-4830-acc6-55c1b5399940 req-3853eae8-e98c-4e01-a496-f9ea6cab9940 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Refreshing instance network info cache due to event network-changed-b88a1e01-9e7f-49da-9f11-434972486fb7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:16:21 np0005474864 nova_compute[192593]: 2025-10-07 20:16:21.406 2 DEBUG oslo_concurrency.lockutils [req-79f1d283-0bf6-4830-acc6-55c1b5399940 req-3853eae8-e98c-4e01-a496-f9ea6cab9940 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-d37e7dbd-01fd-4484-9bdb-f09d24420fa7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:16:21 np0005474864 nova_compute[192593]: 2025-10-07 20:16:21.406 2 DEBUG oslo_concurrency.lockutils [req-79f1d283-0bf6-4830-acc6-55c1b5399940 req-3853eae8-e98c-4e01-a496-f9ea6cab9940 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-d37e7dbd-01fd-4484-9bdb-f09d24420fa7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:16:21 np0005474864 nova_compute[192593]: 2025-10-07 20:16:21.407 2 DEBUG nova.network.neutron [req-79f1d283-0bf6-4830-acc6-55c1b5399940 req-3853eae8-e98c-4e01-a496-f9ea6cab9940 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Refreshing network info cache for port b88a1e01-9e7f-49da-9f11-434972486fb7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.120 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.121 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.121 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.122 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.194 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.279 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.282 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.363 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.375 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.445 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.447 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.514 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.767 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.769 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5447MB free_disk=73.4338607788086GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.769 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.770 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.842 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Instance c491b943-fbbd-46e0-be8c-74a8c1378ab3 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.843 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Instance d37e7dbd-01fd-4484-9bdb-f09d24420fa7 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.843 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.843 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.902 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.917 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.938 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:16:22 np0005474864 nova_compute[192593]: 2025-10-07 20:16:22.938 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.169s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:23 np0005474864 nova_compute[192593]: 2025-10-07 20:16:23.452 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:23 np0005474864 nova_compute[192593]: 2025-10-07 20:16:23.501 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:23 np0005474864 nova_compute[192593]: 2025-10-07 20:16:23.942 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:16:24 np0005474864 nova_compute[192593]: 2025-10-07 20:16:24.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:16:24 np0005474864 nova_compute[192593]: 2025-10-07 20:16:24.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:16:24 np0005474864 nova_compute[192593]: 2025-10-07 20:16:24.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:16:24 np0005474864 nova_compute[192593]: 2025-10-07 20:16:24.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:16:24 np0005474864 nova_compute[192593]: 2025-10-07 20:16:24.294 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "refresh_cache-c491b943-fbbd-46e0-be8c-74a8c1378ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:16:24 np0005474864 nova_compute[192593]: 2025-10-07 20:16:24.295 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquired lock "refresh_cache-c491b943-fbbd-46e0-be8c-74a8c1378ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:16:24 np0005474864 nova_compute[192593]: 2025-10-07 20:16:24.295 2 DEBUG nova.network.neutron [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  7 16:16:24 np0005474864 nova_compute[192593]: 2025-10-07 20:16:24.296 2 DEBUG nova.objects.instance [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid c491b943-fbbd-46e0-be8c-74a8c1378ab3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:16:25 np0005474864 nova_compute[192593]: 2025-10-07 20:16:25.492 2 DEBUG nova.network.neutron [req-79f1d283-0bf6-4830-acc6-55c1b5399940 req-3853eae8-e98c-4e01-a496-f9ea6cab9940 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Updated VIF entry in instance network info cache for port b88a1e01-9e7f-49da-9f11-434972486fb7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:16:25 np0005474864 nova_compute[192593]: 2025-10-07 20:16:25.498 2 DEBUG nova.network.neutron [req-79f1d283-0bf6-4830-acc6-55c1b5399940 req-3853eae8-e98c-4e01-a496-f9ea6cab9940 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Updating instance_info_cache with network_info: [{"id": "b88a1e01-9e7f-49da-9f11-434972486fb7", "address": "fa:16:3e:a0:2b:b7", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb88a1e01-9e", "ovs_interfaceid": "b88a1e01-9e7f-49da-9f11-434972486fb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "address": "fa:16:3e:72:12:14", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9315ca92-32", "ovs_interfaceid": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:16:25 np0005474864 nova_compute[192593]: 2025-10-07 20:16:25.534 2 DEBUG oslo_concurrency.lockutils [req-79f1d283-0bf6-4830-acc6-55c1b5399940 req-3853eae8-e98c-4e01-a496-f9ea6cab9940 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-d37e7dbd-01fd-4484-9bdb-f09d24420fa7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:16:26 np0005474864 nova_compute[192593]: 2025-10-07 20:16:26.441 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:26 np0005474864 nova_compute[192593]: 2025-10-07 20:16:26.442 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:26 np0005474864 nova_compute[192593]: 2025-10-07 20:16:26.465 2 DEBUG nova.compute.manager [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  7 16:16:26 np0005474864 nova_compute[192593]: 2025-10-07 20:16:26.560 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:26 np0005474864 nova_compute[192593]: 2025-10-07 20:16:26.561 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:26 np0005474864 nova_compute[192593]: 2025-10-07 20:16:26.570 2 DEBUG nova.virt.hardware [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  7 16:16:26 np0005474864 nova_compute[192593]: 2025-10-07 20:16:26.571 2 INFO nova.compute.claims [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  7 16:16:26 np0005474864 nova_compute[192593]: 2025-10-07 20:16:26.782 2 DEBUG nova.compute.provider_tree [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:16:26 np0005474864 nova_compute[192593]: 2025-10-07 20:16:26.802 2 DEBUG nova.scheduler.client.report [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:16:26 np0005474864 nova_compute[192593]: 2025-10-07 20:16:26.830 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.270s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:26 np0005474864 nova_compute[192593]: 2025-10-07 20:16:26.831 2 DEBUG nova.compute.manager [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  7 16:16:26 np0005474864 nova_compute[192593]: 2025-10-07 20:16:26.903 2 DEBUG nova.compute.manager [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  7 16:16:26 np0005474864 nova_compute[192593]: 2025-10-07 20:16:26.904 2 DEBUG nova.network.neutron [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  7 16:16:26 np0005474864 nova_compute[192593]: 2025-10-07 20:16:26.933 2 INFO nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  7 16:16:26 np0005474864 nova_compute[192593]: 2025-10-07 20:16:26.959 2 DEBUG nova.compute.manager [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.104 2 DEBUG nova.compute.manager [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.107 2 DEBUG nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.108 2 INFO nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Creating image(s)#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.109 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "/var/lib/nova/instances/01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.109 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "/var/lib/nova/instances/01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.110 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "/var/lib/nova/instances/01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.144 2 DEBUG nova.policy [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ab16a639b2af44c7bc4218a1b1b91068', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4b69ac5dc2b44912af0aa0671c7e3696', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.150 2 DEBUG oslo_concurrency.processutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.245 2 DEBUG oslo_concurrency.processutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.247 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.248 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.274 2 DEBUG oslo_concurrency.processutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.365 2 DEBUG oslo_concurrency.processutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.366 2 DEBUG oslo_concurrency.processutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.426 2 DEBUG oslo_concurrency.processutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc/disk 1073741824" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.427 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.179s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.428 2 DEBUG oslo_concurrency.processutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.517 2 DEBUG oslo_concurrency.processutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.519 2 DEBUG nova.virt.disk.api [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Checking if we can resize image /var/lib/nova/instances/01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.519 2 DEBUG oslo_concurrency.processutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.545 2 DEBUG nova.network.neutron [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Updating instance_info_cache with network_info: [{"id": "c1d00195-4d32-45ac-b745-1a913060f39d", "address": "fa:16:3e:e1:c8:d3", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d00195-4d", "ovs_interfaceid": "c1d00195-4d32-45ac-b745-1a913060f39d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "address": "fa:16:3e:24:aa:7e", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8c9e14-7c", "ovs_interfaceid": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.579 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Releasing lock "refresh_cache-c491b943-fbbd-46e0-be8c-74a8c1378ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.579 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.580 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.581 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.581 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.582 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.582 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.589 2 DEBUG oslo_concurrency.processutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc/disk --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.590 2 DEBUG nova.virt.disk.api [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Cannot resize image /var/lib/nova/instances/01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.591 2 DEBUG nova.objects.instance [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lazy-loading 'migration_context' on Instance uuid 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.608 2 DEBUG nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.608 2 DEBUG nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Ensure instance console log exists: /var/lib/nova/instances/01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.609 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.610 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:27 np0005474864 nova_compute[192593]: 2025-10-07 20:16:27.611 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:28 np0005474864 nova_compute[192593]: 2025-10-07 20:16:28.094 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:16:28 np0005474864 nova_compute[192593]: 2025-10-07 20:16:28.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:28 np0005474864 nova_compute[192593]: 2025-10-07 20:16:28.503 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:28 np0005474864 nova_compute[192593]: 2025-10-07 20:16:28.507 2 DEBUG nova.network.neutron [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Successfully created port: 44a32614-55d7-4ef1-a5fd-a40fcb2f1932 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  7 16:16:30 np0005474864 nova_compute[192593]: 2025-10-07 20:16:30.456 2 DEBUG nova.network.neutron [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Successfully updated port: 44a32614-55d7-4ef1-a5fd-a40fcb2f1932 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:16:30 np0005474864 nova_compute[192593]: 2025-10-07 20:16:30.481 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "refresh_cache-01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:16:30 np0005474864 nova_compute[192593]: 2025-10-07 20:16:30.481 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquired lock "refresh_cache-01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:16:30 np0005474864 nova_compute[192593]: 2025-10-07 20:16:30.481 2 DEBUG nova.network.neutron [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:16:30 np0005474864 nova_compute[192593]: 2025-10-07 20:16:30.610 2 DEBUG nova.compute.manager [req-966abc68-c37f-4972-a211-d47abdd03318 req-43f30327-707a-41e4-9eec-4f7dfe5cce35 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Received event network-changed-44a32614-55d7-4ef1-a5fd-a40fcb2f1932 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:16:30 np0005474864 nova_compute[192593]: 2025-10-07 20:16:30.610 2 DEBUG nova.compute.manager [req-966abc68-c37f-4972-a211-d47abdd03318 req-43f30327-707a-41e4-9eec-4f7dfe5cce35 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Refreshing instance network info cache due to event network-changed-44a32614-55d7-4ef1-a5fd-a40fcb2f1932. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:16:30 np0005474864 nova_compute[192593]: 2025-10-07 20:16:30.612 2 DEBUG oslo_concurrency.lockutils [req-966abc68-c37f-4972-a211-d47abdd03318 req-43f30327-707a-41e4-9eec-4f7dfe5cce35 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:16:31 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:31Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:a0:2b:b7 10.100.0.9
Oct  7 16:16:31 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:31Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:a0:2b:b7 10.100.0.9
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.257 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'name': 'tempest-TestGettingAddress-server-1139550974', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000019', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '2f9bf744045540618c9980fd4a7694f5', 'user_id': '334f092941fc46c496c7def76b2cfe18', 'hostId': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.260 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'name': 'tempest-TestGettingAddress-server-1118891772', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001c', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '2f9bf744045540618c9980fd4a7694f5', 'user_id': '334f092941fc46c496c7def76b2cfe18', 'hostId': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.261 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.265 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c491b943-fbbd-46e0-be8c-74a8c1378ab3 / tapc1d00195-4d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.266 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for c491b943-fbbd-46e0-be8c-74a8c1378ab3 / tapfb8c9e14-7c inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.266 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.incoming.packets volume: 30 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.267 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.incoming.packets volume: 17 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.271 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d37e7dbd-01fd-4484-9bdb-f09d24420fa7 / tapb88a1e01-9e inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.271 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d37e7dbd-01fd-4484-9bdb-f09d24420fa7 / tap9315ca92-32 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.272 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.incoming.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.272 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.incoming.packets volume: 10 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df0b68c8-3c6f-4892-b0ad-792ed6caa112', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 30, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapc1d00195-4d', 'timestamp': '2025-10-07T20:16:31.261659', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapc1d00195-4d', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:c8:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc1d00195-4d'}, 'message_id': '836cc02c-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': '8d73e4716b649b9981490e1a4456856c9c2fdfd982e546e0b85be93cb7a04c8e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 17, 
'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapfb8c9e14-7c', 'timestamp': '2025-10-07T20:16:31.261659', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapfb8c9e14-7c', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:aa:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb8c9e14-7c'}, 'message_id': '836cd742-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': '6f94bda13bf637b70721b5479d2d60eb86e366803bf40de99caaf92befbea9db'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tapb88a1e01-9e', 'timestamp': '2025-10-07T20:16:31.261659', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tapb88a1e01-9e', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': 
'9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:2b:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb88a1e01-9e'}, 'message_id': '836d91c8-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': '0d24f71cc6266ecc55e6f48dc517d6c70354f87fc35b2451896e1b305ddc5813'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 10, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tap9315ca92-32', 'timestamp': '2025-10-07T20:16:31.261659', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tap9315ca92-32', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:12:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9315ca92-32'}, 'message_id': '836da708-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': 'bd75e92443f19fd563e00e85dc5fea06801073b9cb1c2b0151eb5423d6e5f069'}]}, 'timestamp': '2025-10-07 20:16:31.273431', '_unique_id': '2948f5161e044ff89cfa70b2d991b22f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.275 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.277 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  7 16:16:31 np0005474864 nova_compute[192593]: 2025-10-07 20:16:31.293 2 DEBUG nova.network.neutron [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.295 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.296 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.309 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.310 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea181cb6-600d-454b-9fa7-a7ec5e60cc2f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3-vda', 'timestamp': '2025-10-07T20:16:31.277564', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83712a22-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.227409033, 'message_signature': 'e1db300fd7d1eaf2f49576be6369a88d7ba867200e7e06d65651f2ab35365a87'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 
'c491b943-fbbd-46e0-be8c-74a8c1378ab3-sda', 'timestamp': '2025-10-07T20:16:31.277564', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83713f9e-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.227409033, 'message_signature': '2e9b8e87c54a84245b55c9f2c31d57194500e7e0246c5e49037258d8519be9e1'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7-vda', 'timestamp': '2025-10-07T20:16:31.277564', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 
'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83734668-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.246713465, 'message_signature': '4611c4bf3a640dcf991204d81cf8e51f3d5c891c80d05d20f66b62763e3b5997'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7-sda', 'timestamp': '2025-10-07T20:16:31.277564', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83735e6e-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.246713465, 'message_signature': '4e1b303a3e2dcbeb6cd7f8fb49a33aaaafa4f0e9fc22b3b427d9db80a36a59b8'}]}, 'timestamp': '2025-10-07 20:16:31.310830', '_unique_id': 'bb8a281a194d4414ae41ab317304941b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.312 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.313 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.313 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.314 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.315 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.315 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '85a92025-eb9f-4a1a-a258-5f065e7a57c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapc1d00195-4d', 'timestamp': '2025-10-07T20:16:31.313918', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapc1d00195-4d', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:c8:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc1d00195-4d'}, 'message_id': '8373eb54-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': '6f90000e2058360b42f0dcc3f593bddc5c59a751a3102fb70f39d128a187c97e'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapfb8c9e14-7c', 'timestamp': '2025-10-07T20:16:31.313918', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapfb8c9e14-7c', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:aa:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb8c9e14-7c'}, 'message_id': '8374030a-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': 'c9c6761892a6075c0831c0baa644112e495372765dd3ade1e0e34e46e052b156'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tapb88a1e01-9e', 'timestamp': '2025-10-07T20:16:31.313918', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tapb88a1e01-9e', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 
'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:2b:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb88a1e01-9e'}, 'message_id': '8374170a-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': 'b7decb6f7df050069fb90b9fb51586a37b4fa25a06d25cb376fb8d51b6b09368'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tap9315ca92-32', 'timestamp': '2025-10-07T20:16:31.313918', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tap9315ca92-32', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 
'fa:16:3e:72:12:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9315ca92-32'}, 'message_id': '837428da-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': 'aa4ee3b2529fecb5dcb5571cbc9eed61f70c1102b5086ede153a1695b81e6aa5'}]}, 'timestamp': '2025-10-07 20:16:31.316007', '_unique_id': '4a55121e7f5541c48723616699f63762'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.317 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.318 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.345 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/cpu volume: 10880000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.371 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/cpu volume: 11010000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a6da2c6f-7471-4804-b242-1840799b7d60', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10880000000, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'timestamp': '2025-10-07T20:16:31.319113', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '8378c9e4-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.295257624, 'message_signature': 'a919c86f3f279ca41ee8d659d0625c433bfe0b5291f9650f2dd17977136201f0'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11010000000, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 
'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'timestamp': '2025-10-07T20:16:31.319113', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '837cb5fe-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.320996961, 'message_signature': '9caf7c630736b2cb44746fd11c70e82ec2b03fe44603a4f94970183789b51a72'}]}, 'timestamp': '2025-10-07 20:16:31.372142', '_unique_id': 'f64a957fcf3d4e38bcf5627d6a425f6d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.373 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.375 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.412 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.device.read.latency volume: 554794433 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.412 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.device.read.latency volume: 131481494 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.444 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.device.read.latency volume: 651493552 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.445 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.device.read.latency volume: 112906781 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b745366c-9b57-41b1-82d2-70e016392692', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 554794433, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3-vda', 'timestamp': '2025-10-07T20:16:31.375541', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8382ee92-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.325396927, 'message_signature': '1305de959d1a996c9e0116640018828b0a3121059f087bad1680c95655443ddb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 131481494, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': 
None, 'resource_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3-sda', 'timestamp': '2025-10-07T20:16:31.375541', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83830af8-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.325396927, 'message_signature': 'e226be0195093d5877849de8861bc3c6051a8e1940f3e1def0fcd71a89834cdd'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 651493552, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7-vda', 'timestamp': '2025-10-07T20:16:31.375541', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8387e4d8-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.363467356, 'message_signature': 'a8f9c07d38667cf21701b48e5d78a77448fdf9b6881606f21135248cc4c2e345'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 112906781, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7-sda', 'timestamp': '2025-10-07T20:16:31.375541', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8387f4b4-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.363467356, 'message_signature': '335cf23e8123521d7b1b64cd0f7db46521e2ea3af2044db8940c306e73019835'}]}, 'timestamp': '2025-10-07 20:16:31.445667', '_unique_id': 'fa65474d5bd546c796bf8042887e6bce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.446 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.447 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.447 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.448 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1139550974>, <NovaLikeServer: tempest-TestGettingAddress-server-1118891772>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1139550974>, <NovaLikeServer: tempest-TestGettingAddress-server-1118891772>]
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.448 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.448 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/memory.usage volume: 43.69921875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.448 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/memory.usage volume: 40.40625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88a6e9dc-50d7-4fb3-a4d5-98bf794db480', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 43.69921875, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'timestamp': '2025-10-07T20:16:31.448429', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '83886d22-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.295257624, 'message_signature': 'e3cbda05a8734ed133da779e235d4929ded6c06712057077c1b80c5158277d6d'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.40625, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 
'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'timestamp': '2025-10-07T20:16:31.448429', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '83887bb4-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.320996961, 'message_signature': '5fecbff3319c8d552a2dc19505b47c486804a0a7acfdd98708c672aff7db286d'}]}, 'timestamp': '2025-10-07 20:16:31.449111', '_unique_id': 'f72e8f9b752c4eefa831bc24b4b3c92a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.449 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.450 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.451 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.device.write.requests volume: 337 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.451 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.451 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.device.write.requests volume: 281 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.452 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '296e166e-d803-4e54-91c5-7eb5a30b3929', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 337, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3-vda', 'timestamp': '2025-10-07T20:16:31.451056', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8388d6cc-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.325396927, 'message_signature': 'c6533c5afc7d7b6dcd8e93f1d6b2d4325828aae95d6899bcdf58699a8daf33e3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': 
None, 'resource_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3-sda', 'timestamp': '2025-10-07T20:16:31.451056', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8388e4fa-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.325396927, 'message_signature': '0843d3d9d5c7643d7966360ead9a3af55206e5eead42eb73732278b045b43836'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 281, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7-vda', 'timestamp': '2025-10-07T20:16:31.451056', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '8388f332-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.363467356, 'message_signature': 'f493b8742f1737e5b6e788a4608a9b62959cf5dfaa8049d0c6dc9b66d4937894'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7-sda', 'timestamp': '2025-10-07T20:16:31.451056', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83890232-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.363467356, 'message_signature': '79beafc080fac0cbc32061770807c9d4112f4e10aff14709f5e1493283fa1a29'}]}, 'timestamp': '2025-10-07 20:16:31.452553', '_unique_id': '4a303a8d41134f11844b219d6009192e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.453 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.454 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.454 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.454 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.455 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.455 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6668d325-faa7-4d7b-af8e-ac152aabbd0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapc1d00195-4d', 'timestamp': '2025-10-07T20:16:31.454402', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapc1d00195-4d', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:c8:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc1d00195-4d'}, 'message_id': '838956ce-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': '333388d3a16171b078910725655219500077734e6e8f205be9325777c7cc1770'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapfb8c9e14-7c', 'timestamp': '2025-10-07T20:16:31.454402', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapfb8c9e14-7c', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:aa:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb8c9e14-7c'}, 'message_id': '83896394-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': 'ce3cecf8a507fec9dd90387df734350a12899e7e6f1bf20f9dd71a6e5baed130'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tapb88a1e01-9e', 'timestamp': '2025-10-07T20:16:31.454402', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tapb88a1e01-9e', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': 
'9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:2b:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb88a1e01-9e'}, 'message_id': '83896eb6-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': 'c55683a562b6ca07e1a4e283c20f9ee66925bcfcc23391ee10e9f99c401311d3'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tap9315ca92-32', 'timestamp': '2025-10-07T20:16:31.454402', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tap9315ca92-32', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:12:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9315ca92-32'}, 'message_id': '83897cf8-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': '35cd0f62346e065e94d2a6c6d8c6ea114be518253d62caa127f9f30f49a84ff6'}]}, 'timestamp': '2025-10-07 20:16:31.455707', '_unique_id': '435d025aa7eb4a8589dd95622ec005ab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.456 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.457 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.457 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.outgoing.bytes volume: 3456 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.457 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.outgoing.bytes volume: 2962 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.457 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.outgoing.bytes volume: 992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.458 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.outgoing.bytes volume: 948 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ac53f93-f591-4a0e-b07a-1e8f8b22c467', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3456, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapc1d00195-4d', 'timestamp': '2025-10-07T20:16:31.457306', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapc1d00195-4d', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:c8:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc1d00195-4d'}, 'message_id': '8389c7b2-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': '28b85261771666dfd5f77d6954e0c80bafa617cbc3fc19d85bd84991c91ce9ab'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2962, 'user_id': 
'334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapfb8c9e14-7c', 'timestamp': '2025-10-07T20:16:31.457306', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapfb8c9e14-7c', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:aa:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb8c9e14-7c'}, 'message_id': '8389d27a-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': '30a9e9d286f258ce5ef8e0191d82c568c421eb299844672583aa63f91b08d6fb'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 992, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tapb88a1e01-9e', 'timestamp': '2025-10-07T20:16:31.457306', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tapb88a1e01-9e', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 
'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:2b:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb88a1e01-9e'}, 'message_id': '8389dcc0-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': 'd740a8eec6f89527278ccfb801f9aae0e1650a197b64f3fe750f202051d94f37'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 948, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tap9315ca92-32', 'timestamp': '2025-10-07T20:16:31.457306', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tap9315ca92-32', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 
'fa:16:3e:72:12:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9315ca92-32'}, 'message_id': '8389e940-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': '79766284016fe5320ff292b6bdb3ff322d619e4025e1863b15e7f36d6b0ba899'}]}, 'timestamp': '2025-10-07 20:16:31.458479', '_unique_id': '21ae5533e75b4c7380e76836dde6bab9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.459 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.460 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.incoming.bytes volume: 4579 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.460 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.incoming.bytes volume: 1650 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.460 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.incoming.bytes volume: 1868 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.460 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.incoming.bytes volume: 1024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd934fd82-d9ef-452b-88ed-13202ad1a33c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4579, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapc1d00195-4d', 'timestamp': '2025-10-07T20:16:31.460089', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapc1d00195-4d', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:c8:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc1d00195-4d'}, 'message_id': '838a3526-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': '6dd645c271fb0c788047578f6e9ac9573e516faee5bb60dbfea8286cca53a09d'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1650, 'user_id': 
'334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapfb8c9e14-7c', 'timestamp': '2025-10-07T20:16:31.460089', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapfb8c9e14-7c', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:aa:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb8c9e14-7c'}, 'message_id': '838a4098-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': 'e45ee6d0d20a67d8b809597038eb857caf1c6e361e5f3b0101468991cd285a28'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1868, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tapb88a1e01-9e', 'timestamp': '2025-10-07T20:16:31.460089', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tapb88a1e01-9e', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': 
'9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:2b:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb88a1e01-9e'}, 'message_id': '838a4b4c-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': '0ef947e8e6d73be8e7eccd7f36dfaf242961b0563459acf3f22960769737292f'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1024, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tap9315ca92-32', 'timestamp': '2025-10-07T20:16:31.460089', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tap9315ca92-32', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:12:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9315ca92-32'}, 'message_id': '838a55ce-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': '99f44413cbd7b49a476b597c518390be77b54d4c6b9ceb012894900e8698f6a3'}]}, 'timestamp': '2025-10-07 20:16:31.461294', '_unique_id': '8b14d0679cd44aca9ab74938e1df6e34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.461 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.462 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.462 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.device.allocation volume: 30351360 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.463 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.463 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.device.allocation volume: 30482432 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.463 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc75b8b6-2d0a-4eac-851b-ccb4c4fb68c7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30351360, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3-vda', 'timestamp': '2025-10-07T20:16:31.462891', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '838aa150-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.227409033, 'message_signature': '13cd99a95567baa0e6a8e222ca4d71f3d72a56f12175ee98beeec1244dbeab9d'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 
'c491b943-fbbd-46e0-be8c-74a8c1378ab3-sda', 'timestamp': '2025-10-07T20:16:31.462891', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '838aad62-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.227409033, 'message_signature': '1c2f73355d1bb43ba7a98786cf5dbdf31f7345419c693d544347b79597b8f4b8'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30482432, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7-vda', 'timestamp': '2025-10-07T20:16:31.462891', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '838ab834-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.246713465, 'message_signature': '9d707ec5a0ea68e7d11bac3dac7b3fdfef6f3656eaf6b6c2c792fe98d348a25c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7-sda', 'timestamp': '2025-10-07T20:16:31.462891', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '838ac20c-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.246713465, 'message_signature': '313174c7ed04db6d9916fc1bfbead4556e00f3f3734de474e3247566ff234f8d'}]}, 'timestamp': '2025-10-07 20:16:31.464012', '_unique_id': '84a37ffa46764a64a6bfb688ea536884'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.464 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.465 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.465 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.outgoing.packets volume: 29 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.465 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.outgoing.packets volume: 25 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.466 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.outgoing.packets volume: 6 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.466 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.outgoing.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94abaf06-8d2b-40e2-ae52-b5101818957d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 29, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapc1d00195-4d', 'timestamp': '2025-10-07T20:16:31.465575', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapc1d00195-4d', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:c8:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc1d00195-4d'}, 'message_id': '838b0a32-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': '915e3264fc79af5e609f7c4e7401bbd694de802a20cf6b24e3f07f0cad487a4b'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 25, 
'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapfb8c9e14-7c', 'timestamp': '2025-10-07T20:16:31.465575', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapfb8c9e14-7c', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:aa:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb8c9e14-7c'}, 'message_id': '838b150e-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': 'bd1a62362ee33f9eb4b39e5320d3b6f49de857e9b00b90f1a84fdd886db86835'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 6, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tapb88a1e01-9e', 'timestamp': '2025-10-07T20:16:31.465575', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tapb88a1e01-9e', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': 
'9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:2b:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb88a1e01-9e'}, 'message_id': '838b20bc-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': 'c56ac90819e220911c1e86ebd866508fd956aeb240ea4a3915ee12b466afb4ec'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tap9315ca92-32', 'timestamp': '2025-10-07T20:16:31.465575', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tap9315ca92-32', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 
'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:12:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9315ca92-32'}, 'message_id': '838b2bc0-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': '9e82493fcc26f807113f1e5c19b8ba5d20b71715dbb4d12ebdcfc6e605b04b01'}]}, 'timestamp': '2025-10-07 20:16:31.466758', '_unique_id': '6d11a18f7993489ca3ea52d695efb0f6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.467 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.468 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.468 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.468 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.468 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.469 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '236cf522-7c85-4d3a-b8a2-f5e0cec6c550', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapc1d00195-4d', 'timestamp': '2025-10-07T20:16:31.468329', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapc1d00195-4d', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:c8:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc1d00195-4d'}, 'message_id': '838b75d0-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': '87d172c572e31781d4625e31c4832a9d0331be98a7759eba9a8cd74bb846367d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapfb8c9e14-7c', 'timestamp': '2025-10-07T20:16:31.468329', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapfb8c9e14-7c', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:aa:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb8c9e14-7c'}, 'message_id': '838b8098-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': '36768f2769020071156b4eb83f8c538378ac65c9e68b5e5b51d31976d92af64a'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tapb88a1e01-9e', 'timestamp': '2025-10-07T20:16:31.468329', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tapb88a1e01-9e', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': 
'9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:2b:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb88a1e01-9e'}, 'message_id': '838b8b4c-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': '3391c7c908fdce540ff63809a17807f2b3bf1f82eb79c1cd94f0401af65e729d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tap9315ca92-32', 'timestamp': '2025-10-07T20:16:31.468329', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tap9315ca92-32', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:12:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9315ca92-32'}, 'message_id': '838b96be-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': '1b1637b6b55738c1ed88f86d4e9647416d625b2f08b57a866c691d4d71b9747d'}]}, 'timestamp': '2025-10-07 20:16:31.469465', '_unique_id': 'f47db7985b374aa99be3992cc270150a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.470 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.471 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.471 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1139550974>, <NovaLikeServer: tempest-TestGettingAddress-server-1118891772>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1139550974>, <NovaLikeServer: tempest-TestGettingAddress-server-1118891772>]
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.471 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.471 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.471 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1139550974>, <NovaLikeServer: tempest-TestGettingAddress-server-1118891772>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1139550974>, <NovaLikeServer: tempest-TestGettingAddress-server-1118891772>]
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.471 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.471 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.471 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1139550974>, <NovaLikeServer: tempest-TestGettingAddress-server-1118891772>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1139550974>, <NovaLikeServer: tempest-TestGettingAddress-server-1118891772>]
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.472 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.472 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.472 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.472 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.473 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8d6d3c5c-a098-4a02-841c-a4bee6d0b162', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapc1d00195-4d', 'timestamp': '2025-10-07T20:16:31.472228', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapc1d00195-4d', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:c8:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc1d00195-4d'}, 'message_id': '838c0f36-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': 'c31afb9d0f606c795ffe5dc28e3b02bd88c10686a8845ab3dc018c4d6db6d748'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapfb8c9e14-7c', 'timestamp': '2025-10-07T20:16:31.472228', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapfb8c9e14-7c', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:aa:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb8c9e14-7c'}, 'message_id': '838c1aa8-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': 'c25d6205203b58b788c36d310ae86110219d1bbdca07e04ea99435db6b6997fb'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tapb88a1e01-9e', 'timestamp': '2025-10-07T20:16:31.472228', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tapb88a1e01-9e', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': 
'9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:2b:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb88a1e01-9e'}, 'message_id': '838c2548-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': '10831ebe1ebd77c46e21da4bcff1373b91805582d54160a8d9dbcff86bc6dfe1'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tap9315ca92-32', 'timestamp': '2025-10-07T20:16:31.472228', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tap9315ca92-32', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:12:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9315ca92-32'}, 'message_id': '838c3092-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': 'f26f98854d68f2a640bdd926166baabe31d581e7b6a198715b7305d693ef5b28'}]}, 'timestamp': '2025-10-07 20:16:31.473405', '_unique_id': '190a55b2598c4aa7a4b2daa3bfbe62d0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.474 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.475 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.device.write.latency volume: 3297551506 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.475 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.475 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.device.write.latency volume: 33715004067 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.475 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '70a73111-a18d-4a9e-8185-098cf8c9f504', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3297551506, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3-vda', 'timestamp': '2025-10-07T20:16:31.474989', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '838c79da-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.325396927, 'message_signature': '4ff049a36ac185ea2dc2eb732cf31b89818978f30cf57ebd210f76458c64d3ee'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 
'resource_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3-sda', 'timestamp': '2025-10-07T20:16:31.474989', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '838c8588-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.325396927, 'message_signature': 'f4162816a00c937b5ee3ee23ea5616b98058e307d44f77732bbd2da9424467fa'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 33715004067, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7-vda', 'timestamp': '2025-10-07T20:16:31.474989', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '838c8fec-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.363467356, 'message_signature': '389393ab8ef960db96f483099a7c348f46ab940603ef02502ef8de6932c6e490'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7-sda', 'timestamp': '2025-10-07T20:16:31.474989', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '838c9a00-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.363467356, 'message_signature': 'c3fdd70282cc602eb77e29bac757c646e3bde41ea8245370ca10c42bf5dcc64a'}]}, 'timestamp': '2025-10-07 20:16:31.476094', '_unique_id': '77c79037296e4473bd81220b45d5d077'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.476 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.477 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.477 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.477 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.478 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.478 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'acaba583-7753-4932-bf7f-06c0c1e18bff', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapc1d00195-4d', 'timestamp': '2025-10-07T20:16:31.477655', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapc1d00195-4d', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:c8:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc1d00195-4d'}, 'message_id': '838ce334-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': '640f629f82fa711ce724385a79cd5f8b5eae92580aaf931a8adc5c4e3ebe40ca'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapfb8c9e14-7c', 'timestamp': '2025-10-07T20:16:31.477655', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapfb8c9e14-7c', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:aa:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb8c9e14-7c'}, 'message_id': '838cee42-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': '09aea4ef10cc2a97cd51d3906e5125824fdbce86f0cc9123b2f5eeda1d8ad7c8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tapb88a1e01-9e', 'timestamp': '2025-10-07T20:16:31.477655', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tapb88a1e01-9e', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 
'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:2b:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb88a1e01-9e'}, 'message_id': '838cfb80-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': '00016e25e1207abcd462710bd8a77930b98c219d40cef372e71d6cca29b95bfd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tap9315ca92-32', 'timestamp': '2025-10-07T20:16:31.477655', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tap9315ca92-32', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 
'fa:16:3e:72:12:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9315ca92-32'}, 'message_id': '838d0a9e-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': 'd52a31b18e51c87d0312456fa9609ec8c260130f23aa7f43aa4b975a63135494'}]}, 'timestamp': '2025-10-07 20:16:31.479035', '_unique_id': '29457af85c1a499a88f00255717de8cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.479 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.480 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.480 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.481 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.481 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.481 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8350da91-3a2c-4a18-b420-c0c707d38aad', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3-vda', 'timestamp': '2025-10-07T20:16:31.480803', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '838d5de6-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.227409033, 'message_signature': '494542afb529a76c1729b43aa767747d635cc3717080f3537767d283da3b4667'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 
'c491b943-fbbd-46e0-be8c-74a8c1378ab3-sda', 'timestamp': '2025-10-07T20:16:31.480803', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '838d6944-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.227409033, 'message_signature': '95ab21a72c088730ebcd4a9c78cd0fd256c26901cdad2ba65082ec3f1de49ac3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7-vda', 'timestamp': '2025-10-07T20:16:31.480803', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '838d73ee-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.246713465, 'message_signature': '4b96083041ac0bebd3b7b07cd398071f2de0ec321e7c1887aae7c0e9d2e7424d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7-sda', 'timestamp': '2025-10-07T20:16:31.480803', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '838d7e16-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.246713465, 'message_signature': 'd44dca759b8572bd95633ead3dfcf762d25010ba3caa146556b80ed5d24702fb'}]}, 'timestamp': '2025-10-07 20:16:31.481931', '_unique_id': 'f2ed9aa1d7e74a18a1fbdf4aa382f83d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.482 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.483 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.483 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.device.write.bytes volume: 73089024 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.484 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.484 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.device.write.bytes volume: 72769536 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.484 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '536fdf67-4853-48e4-a32b-3392181b59b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 73089024, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3-vda', 'timestamp': '2025-10-07T20:16:31.483776', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '838dd28a-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.325396927, 'message_signature': '88f05d57d06fe4e8a3f78db2c1219afa1daa2d62f36c162eeb467849381446bb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 
'resource_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3-sda', 'timestamp': '2025-10-07T20:16:31.483776', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '838de11c-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.325396927, 'message_signature': '58837cb9eed4890eb7f9912d187a84ce99473caa011d8b4cc4c057847c353d53'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72769536, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7-vda', 'timestamp': '2025-10-07T20:16:31.483776', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '838def54-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.363467356, 'message_signature': 'b5e24f540a479a4bfdfd6afb3286565610fe49e80d74dabfd8def1b62f20e64e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7-sda', 'timestamp': '2025-10-07T20:16:31.483776', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '838dfda0-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.363467356, 'message_signature': 'fd1f17cdd8c559cab2e73f1557d825fde0182874b33fa51c0176dafdd16d6bc2'}]}, 'timestamp': '2025-10-07 20:16:31.485269', '_unique_id': '78555485c6094870902447dbb6d9a275'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.486 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.device.read.bytes volume: 30722560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.487 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.487 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.device.read.bytes volume: 30960128 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.487 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d2ea02a-9f7c-4610-b1fe-ad300dc07ad8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30722560, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3-vda', 'timestamp': '2025-10-07T20:16:31.486905', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '838e4b3e-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.325396927, 'message_signature': '135ba6e2d99e7882efe0c8e19937f0a75ea272f37b93b3ddf1f8db44135aac8b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 
'resource_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3-sda', 'timestamp': '2025-10-07T20:16:31.486905', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '838e5728-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.325396927, 'message_signature': '182091d861f09a16027ed2d440d86919ae507d4eea35720fe762c61785d46be4'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30960128, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7-vda', 'timestamp': '2025-10-07T20:16:31.486905', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '838e6132-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.363467356, 'message_signature': '269520cc4759abcea2ab2ad2562f04a4d4f53bafc084f932c6472e8067f30a13'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7-sda', 'timestamp': '2025-10-07T20:16:31.486905', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '838e6b28-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.363467356, 'message_signature': '1951a1c8de66555cf35529269ab990b3b7b83adbecf651ff2059612de6de9a65'}]}, 'timestamp': '2025-10-07 20:16:31.487999', '_unique_id': '5ba450207c3544f59fbb47f62def03f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.488 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.489 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.489 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.489 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.490 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.490 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a293b3bd-77c4-4e08-becf-47aceff56c44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapc1d00195-4d', 'timestamp': '2025-10-07T20:16:31.489552', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapc1d00195-4d', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e1:c8:d3', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapc1d00195-4d'}, 'message_id': '838eb2ae-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': '701888c74522cb2c5254a0bdfce63c6c8112fdf846d088b10cc51241d46189c4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000019-c491b943-fbbd-46e0-be8c-74a8c1378ab3-tapfb8c9e14-7c', 'timestamp': '2025-10-07T20:16:31.489552', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'tapfb8c9e14-7c', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:24:aa:7e', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapfb8c9e14-7c'}, 'message_id': '838ebd30-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.211534549, 'message_signature': '5fe895628b96dfa21bf26c290aaa5be6c23eb6b141ce35df5d7131114282a5e1'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tapb88a1e01-9e', 'timestamp': '2025-10-07T20:16:31.489552', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tapb88a1e01-9e', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': 
'9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:a0:2b:b7', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapb88a1e01-9e'}, 'message_id': '838ec8fc-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': 'c410f63c91f6becde9e6123de0462fbbb0441a11122a7082a8f84f9e596c2b21'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-0000001c-d37e7dbd-01fd-4484-9bdb-f09d24420fa7-tap9315ca92-32', 'timestamp': '2025-10-07T20:16:31.489552', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'tap9315ca92-32', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 
128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:72:12:14', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9315ca92-32'}, 'message_id': '838ed43c-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.2178851, 'message_signature': '03d0f03afe68ea6b5042d0fa8bbd75bc7989178f036c585ff3b5b9658cf2e105'}]}, 'timestamp': '2025-10-07 20:16:31.490729', '_unique_id': '459fd4969b294ab789d77661db6fce7b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.491 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.492 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.492 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.device.read.requests volume: 1104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.492 12 DEBUG ceilometer.compute.pollsters [-] c491b943-fbbd-46e0-be8c-74a8c1378ab3/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.492 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.device.read.requests volume: 1132 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.493 12 DEBUG ceilometer.compute.pollsters [-] d37e7dbd-01fd-4484-9bdb-f09d24420fa7/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd2c9456a-de88-440b-890d-d555b7232663', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1104, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3-vda', 'timestamp': '2025-10-07T20:16:31.492292', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '838f1dd4-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.325396927, 'message_signature': 'aa35bbc804bd2eb2b47e109e07e804ffb36653f66a98f291c3c48dcb42ab8116'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': 
None, 'resource_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3-sda', 'timestamp': '2025-10-07T20:16:31.492292', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1139550974', 'name': 'instance-00000019', 'instance_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '838f27f2-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.325396927, 'message_signature': '39ea70fbf10eba7e630a899badb535d9a2c76146e6d4447fbd765e229efa4c9b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1132, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7-vda', 'timestamp': '2025-10-07T20:16:31.492292', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '838f3206-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.363467356, 'message_signature': 'ea5420f2b43896bca959065f2e6f02ae12d16e60a41c29d0f3b2a774559e6af8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7-sda', 'timestamp': '2025-10-07T20:16:31.492292', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1118891772', 'name': 'instance-0000001c', 'instance_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '838f3c38-a3ba-11f0-9441-fa163e5cce8e', 'monotonic_time': 3827.363467356, 'message_signature': '424ce6bfb67773adb4caf90694480c75739f99decff75926f80cc358113c825b'}]}, 'timestamp': '2025-10-07 20:16:31.493385', '_unique_id': 'f4647225c136463d9a35b5fcb9886dd2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:16:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:16:31.494 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.240 2 DEBUG nova.network.neutron [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Updating instance_info_cache with network_info: [{"id": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "address": "fa:16:3e:7f:60:4d", "network": {"id": "76a6eb6c-a532-47d0-908b-b56f18cd0dee", "bridge": "br-int", "label": "tempest-network-smoke--293453099", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44a32614-55", "ovs_interfaceid": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.267 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Releasing lock "refresh_cache-01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.268 2 DEBUG nova.compute.manager [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Instance network_info: |[{"id": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "address": "fa:16:3e:7f:60:4d", "network": {"id": "76a6eb6c-a532-47d0-908b-b56f18cd0dee", "bridge": "br-int", "label": "tempest-network-smoke--293453099", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44a32614-55", "ovs_interfaceid": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.268 2 DEBUG oslo_concurrency.lockutils [req-966abc68-c37f-4972-a211-d47abdd03318 req-43f30327-707a-41e4-9eec-4f7dfe5cce35 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.268 2 DEBUG nova.network.neutron [req-966abc68-c37f-4972-a211-d47abdd03318 req-43f30327-707a-41e4-9eec-4f7dfe5cce35 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Refreshing network info cache for port 44a32614-55d7-4ef1-a5fd-a40fcb2f1932 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.271 2 DEBUG nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Start _get_guest_xml network_info=[{"id": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "address": "fa:16:3e:7f:60:4d", "network": {"id": "76a6eb6c-a532-47d0-908b-b56f18cd0dee", "bridge": "br-int", "label": "tempest-network-smoke--293453099", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44a32614-55", "ovs_interfaceid": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.277 2 WARNING nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.284 2 DEBUG nova.virt.libvirt.host [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.285 2 DEBUG nova.virt.libvirt.host [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.291 2 DEBUG nova.virt.libvirt.host [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.291 2 DEBUG nova.virt.libvirt.host [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.293 2 DEBUG nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.293 2 DEBUG nova.virt.hardware [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T20:09:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3fec056a-1226-48ad-a02c-e4fe097a9363',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.293 2 DEBUG nova.virt.hardware [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.294 2 DEBUG nova.virt.hardware [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.294 2 DEBUG nova.virt.hardware [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.294 2 DEBUG nova.virt.hardware [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.294 2 DEBUG nova.virt.hardware [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.295 2 DEBUG nova.virt.hardware [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.295 2 DEBUG nova.virt.hardware [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.295 2 DEBUG nova.virt.hardware [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.295 2 DEBUG nova.virt.hardware [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.295 2 DEBUG nova.virt.hardware [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.298 2 DEBUG nova.virt.libvirt.vif [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:16:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1464343180-gen-1-1810787220',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1464343180-gen-1-1810787220',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1464343180-ge',id=30,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLvgCclNCSnL6bAl88RaoE/mDuz9vzVZIyOQ9372aCj0lSBodfF+wIreifgZR5TdbY8cqOUBnLcIsW2x52Tz52LQw3S/GAc8nHVmoD/mlmP4GsIo7dPUAD2amfmN9ntUXQ==',key_name='tempest-TestSecurityGroupsBasicOps-1421588277',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4b69ac5dc2b44912af0aa0671c7e3696',ramdisk_id='',reservation_id='r-om90mukt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1464343180',owner_user_name='tempest-TestSecurityGroupsBasicOps-1464343180-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:16:26Z,user_data=None,user_id='ab16a639b2af44c7bc4218a1b1b91068',uuid=01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "address": "fa:16:3e:7f:60:4d", "network": {"id": "76a6eb6c-a532-47d0-908b-b56f18cd0dee", "bridge": "br-int", "label": "tempest-network-smoke--293453099", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44a32614-55", "ovs_interfaceid": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.299 2 DEBUG nova.network.os_vif_util [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Converting VIF {"id": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "address": "fa:16:3e:7f:60:4d", "network": {"id": "76a6eb6c-a532-47d0-908b-b56f18cd0dee", "bridge": "br-int", "label": "tempest-network-smoke--293453099", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44a32614-55", "ovs_interfaceid": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.299 2 DEBUG nova.network.os_vif_util [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:60:4d,bridge_name='br-int',has_traffic_filtering=True,id=44a32614-55d7-4ef1-a5fd-a40fcb2f1932,network=Network(76a6eb6c-a532-47d0-908b-b56f18cd0dee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44a32614-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.300 2 DEBUG nova.objects.instance [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lazy-loading 'pci_devices' on Instance uuid 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.324 2 DEBUG nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] End _get_guest_xml xml=<domain type="kvm">
Oct  7 16:16:32 np0005474864 nova_compute[192593]:  <uuid>01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc</uuid>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:  <name>instance-0000001e</name>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:  <memory>131072</memory>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:  <vcpu>1</vcpu>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1464343180-gen-1-1810787220</nova:name>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <nova:creationTime>2025-10-07 20:16:32</nova:creationTime>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <nova:flavor name="m1.nano">
Oct  7 16:16:32 np0005474864 nova_compute[192593]:        <nova:memory>128</nova:memory>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:        <nova:disk>1</nova:disk>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:        <nova:swap>0</nova:swap>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:        <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:        <nova:vcpus>1</nova:vcpus>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      </nova:flavor>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <nova:owner>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:        <nova:user uuid="ab16a639b2af44c7bc4218a1b1b91068">tempest-TestSecurityGroupsBasicOps-1464343180-project-member</nova:user>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:        <nova:project uuid="4b69ac5dc2b44912af0aa0671c7e3696">tempest-TestSecurityGroupsBasicOps-1464343180</nova:project>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      </nova:owner>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <nova:ports>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:        <nova:port uuid="44a32614-55d7-4ef1-a5fd-a40fcb2f1932">
Oct  7 16:16:32 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      </nova:ports>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    </nova:instance>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:  <sysinfo type="smbios">
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <entry name="manufacturer">RDO</entry>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <entry name="product">OpenStack Compute</entry>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <entry name="serial">01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc</entry>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <entry name="uuid">01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc</entry>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <entry name="family">Virtual Machine</entry>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <boot dev="hd"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <smbios mode="sysinfo"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <vmcoreinfo/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:  <clock offset="utc">
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <timer name="pit" tickpolicy="delay"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <timer name="hpet" present="no"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:  <cpu mode="custom" match="exact">
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <model>Nehalem</model>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <topology sockets="1" cores="1" threads="1"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <disk type="file" device="disk">
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc/disk"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <target dev="vda" bus="virtio"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <disk type="file" device="cdrom">
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <driver name="qemu" type="raw" cache="none"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc/disk.config"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <target dev="sda" bus="sata"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:7f:60:4d"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <target dev="tap44a32614-55"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <serial type="pty">
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <log file="/var/lib/nova/instances/01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc/console.log" append="off"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <input type="tablet" bus="usb"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <rng model="virtio">
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <backend model="random">/dev/urandom</backend>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <controller type="usb" index="0"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    <memballoon model="virtio">
Oct  7 16:16:32 np0005474864 nova_compute[192593]:      <stats period="10"/>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:16:32 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:16:32 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:16:32 np0005474864 nova_compute[192593]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.325 2 DEBUG nova.compute.manager [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Preparing to wait for external event network-vif-plugged-44a32614-55d7-4ef1-a5fd-a40fcb2f1932 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.325 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.325 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.325 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.326 2 DEBUG nova.virt.libvirt.vif [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:16:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1464343180-gen-1-1810787220',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1464343180-gen-1-1810787220',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1464343180-ge',id=30,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLvgCclNCSnL6bAl88RaoE/mDuz9vzVZIyOQ9372aCj0lSBodfF+wIreifgZR5TdbY8cqOUBnLcIsW2x52Tz52LQw3S/GAc8nHVmoD/mlmP4GsIo7dPUAD2amfmN9ntUXQ==',key_name='tempest-TestSecurityGroupsBasicOps-1421588277',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4b69ac5dc2b44912af0aa0671c7e3696',ramdisk_id='',reservation_id='r-om90mukt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1464343180',owner_user_name='tempest-TestSecurityGroupsBasicOps-1464343180-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:16:26Z,user_data=None,user_id='ab16a639b2af44c7bc4218a1b1b91068',uuid=01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "address": "fa:16:3e:7f:60:4d", "network": {"id": "76a6eb6c-a532-47d0-908b-b56f18cd0dee", "bridge": "br-int", "label": "tempest-network-smoke--293453099", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44a32614-55", "ovs_interfaceid": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.326 2 DEBUG nova.network.os_vif_util [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Converting VIF {"id": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "address": "fa:16:3e:7f:60:4d", "network": {"id": "76a6eb6c-a532-47d0-908b-b56f18cd0dee", "bridge": "br-int", "label": "tempest-network-smoke--293453099", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44a32614-55", "ovs_interfaceid": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.327 2 DEBUG nova.network.os_vif_util [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:7f:60:4d,bridge_name='br-int',has_traffic_filtering=True,id=44a32614-55d7-4ef1-a5fd-a40fcb2f1932,network=Network(76a6eb6c-a532-47d0-908b-b56f18cd0dee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44a32614-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.327 2 DEBUG os_vif [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:60:4d,bridge_name='br-int',has_traffic_filtering=True,id=44a32614-55d7-4ef1-a5fd-a40fcb2f1932,network=Network(76a6eb6c-a532-47d0-908b-b56f18cd0dee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44a32614-55') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.328 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.328 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.333 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap44a32614-55, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.333 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap44a32614-55, col_values=(('external_ids', {'iface-id': '44a32614-55d7-4ef1-a5fd-a40fcb2f1932', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:7f:60:4d', 'vm-uuid': '01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:32 np0005474864 NetworkManager[51631]: <info>  [1759868192.3396] manager: (tap44a32614-55): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/83)
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.341 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.347 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.348 2 INFO os_vif [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:7f:60:4d,bridge_name='br-int',has_traffic_filtering=True,id=44a32614-55d7-4ef1-a5fd-a40fcb2f1932,network=Network(76a6eb6c-a532-47d0-908b-b56f18cd0dee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44a32614-55')#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.414 2 DEBUG nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.414 2 DEBUG nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.414 2 DEBUG nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] No VIF found with MAC fa:16:3e:7f:60:4d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.415 2 INFO nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Using config drive#033[00m
Oct  7 16:16:32 np0005474864 podman[224873]: 2025-10-07 20:16:32.420794627 +0000 UTC m=+0.100340002 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  7 16:16:32 np0005474864 podman[224874]: 2025-10-07 20:16:32.45479374 +0000 UTC m=+0.130471324 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, architecture=x86_64, config_id=edpm, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, io.buildah.version=1.33.7, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  7 16:16:32 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:32Z|00153|binding|INFO|Releasing lport 3fba40e3-39d9-4871-b9b9-3e2e5088af4f from this chassis (sb_readonly=0)
Oct  7 16:16:32 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:32Z|00154|binding|INFO|Releasing lport 5cf38e83-5f07-4562-b663-4850a1d35f81 from this chassis (sb_readonly=0)
Oct  7 16:16:32 np0005474864 nova_compute[192593]: 2025-10-07 20:16:32.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:33 np0005474864 nova_compute[192593]: 2025-10-07 20:16:33.497 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:33 np0005474864 nova_compute[192593]: 2025-10-07 20:16:33.565 2 INFO nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Creating config drive at /var/lib/nova/instances/01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc/disk.config#033[00m
Oct  7 16:16:33 np0005474864 nova_compute[192593]: 2025-10-07 20:16:33.570 2 DEBUG oslo_concurrency.processutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1wd83pus execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:16:33 np0005474864 nova_compute[192593]: 2025-10-07 20:16:33.710 2 DEBUG oslo_concurrency.processutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1wd83pus" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:16:33 np0005474864 kernel: tap44a32614-55: entered promiscuous mode
Oct  7 16:16:33 np0005474864 NetworkManager[51631]: <info>  [1759868193.7759] manager: (tap44a32614-55): new Tun device (/org/freedesktop/NetworkManager/Devices/84)
Oct  7 16:16:33 np0005474864 nova_compute[192593]: 2025-10-07 20:16:33.779 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:33 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:33Z|00155|binding|INFO|Claiming lport 44a32614-55d7-4ef1-a5fd-a40fcb2f1932 for this chassis.
Oct  7 16:16:33 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:33Z|00156|binding|INFO|44a32614-55d7-4ef1-a5fd-a40fcb2f1932: Claiming fa:16:3e:7f:60:4d 10.100.0.14
Oct  7 16:16:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:33.815 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:60:4d 10.100.0.14'], port_security=['fa:16:3e:7f:60:4d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76a6eb6c-a532-47d0-908b-b56f18cd0dee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b69ac5dc2b44912af0aa0671c7e3696', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'bfca1ade-54c0-4867-81fa-6415544ed64d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a35b5ca-173b-49c1-b4c9-9509d16f2f82, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=44a32614-55d7-4ef1-a5fd-a40fcb2f1932) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:16:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:33.816 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 44a32614-55d7-4ef1-a5fd-a40fcb2f1932 in datapath 76a6eb6c-a532-47d0-908b-b56f18cd0dee bound to our chassis#033[00m
Oct  7 16:16:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:33.818 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 76a6eb6c-a532-47d0-908b-b56f18cd0dee#033[00m
Oct  7 16:16:33 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:33Z|00157|binding|INFO|Setting lport 44a32614-55d7-4ef1-a5fd-a40fcb2f1932 ovn-installed in OVS
Oct  7 16:16:33 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:33Z|00158|binding|INFO|Setting lport 44a32614-55d7-4ef1-a5fd-a40fcb2f1932 up in Southbound
Oct  7 16:16:33 np0005474864 nova_compute[192593]: 2025-10-07 20:16:33.829 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:33.834 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[22406bce-2ee8-4e9a-b38a-0852000537fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:33.836 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap76a6eb6c-a1 in ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:16:33 np0005474864 systemd-udevd[224940]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:16:33 np0005474864 systemd-machined[152586]: New machine qemu-10-instance-0000001e.
Oct  7 16:16:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:33.839 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap76a6eb6c-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:16:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:33.839 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[5231c70a-8995-47bb-b61e-7ac137e10bed]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:33.841 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[69a4a926-5f91-4d92-a279-3bb57477d190]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:33 np0005474864 NetworkManager[51631]: <info>  [1759868193.8576] device (tap44a32614-55): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:16:33 np0005474864 NetworkManager[51631]: <info>  [1759868193.8590] device (tap44a32614-55): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:16:33 np0005474864 systemd[1]: Started Virtual Machine qemu-10-instance-0000001e.
Oct  7 16:16:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:33.861 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[a6c76a5b-2e39-4d8c-b990-2a992beafe75]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:33.893 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[2dd32253-1201-4574-8af6-3b65db3160be]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:33.949 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[29626cf7-eb0a-4b9d-9ccc-ec9d74a5e32b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:33 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:33.957 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[86729b0f-03e4-439b-841b-c6d68f653c6a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:33 np0005474864 NetworkManager[51631]: <info>  [1759868193.9593] manager: (tap76a6eb6c-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/85)
Oct  7 16:16:33 np0005474864 systemd-udevd[224943]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:34.005 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[4aededfc-4ad9-4e17-afd2-1c785f35ad47]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:34.009 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[8ac45c69-84ea-41b4-b01f-ab7d07ca42f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.026 2 DEBUG nova.compute.manager [req-9f6da41f-0c81-4935-99e9-b03f2fc43a2a req-a0c3c51a-f688-4cdc-874e-75349804ae76 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Received event network-vif-plugged-44a32614-55d7-4ef1-a5fd-a40fcb2f1932 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.027 2 DEBUG oslo_concurrency.lockutils [req-9f6da41f-0c81-4935-99e9-b03f2fc43a2a req-a0c3c51a-f688-4cdc-874e-75349804ae76 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.027 2 DEBUG oslo_concurrency.lockutils [req-9f6da41f-0c81-4935-99e9-b03f2fc43a2a req-a0c3c51a-f688-4cdc-874e-75349804ae76 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.027 2 DEBUG oslo_concurrency.lockutils [req-9f6da41f-0c81-4935-99e9-b03f2fc43a2a req-a0c3c51a-f688-4cdc-874e-75349804ae76 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.028 2 DEBUG nova.compute.manager [req-9f6da41f-0c81-4935-99e9-b03f2fc43a2a req-a0c3c51a-f688-4cdc-874e-75349804ae76 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Processing event network-vif-plugged-44a32614-55d7-4ef1-a5fd-a40fcb2f1932 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:16:34 np0005474864 NetworkManager[51631]: <info>  [1759868194.0484] device (tap76a6eb6c-a0): carrier: link connected
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:34.056 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[453f230e-1c74-4d18-95f7-7f4a2d84e74d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:34.081 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[734fce12-9138-4a62-96bd-0fa344875c10]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap76a6eb6c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:72:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382994, 'reachable_time': 39735, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224972, 'error': None, 'target': 'ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:34.108 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[3ea29bd3-1aec-4331-88df-fd33dceb9fbe]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea6:7208'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 382994, 'tstamp': 382994}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224973, 'error': None, 'target': 'ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:34.128 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[8b38629e-651d-426a-ab01-5e9c765c124c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap76a6eb6c-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a6:72:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 220, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382994, 'reachable_time': 39735, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 192, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 192, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224975, 'error': None, 'target': 'ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:34.163 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[a7cbd955-ec6c-4e44-ba53-cb2f9a341c0c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:34.236 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[51a0676b-ed76-490a-8405-7b87f7ab2ac1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:34.238 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap76a6eb6c-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:34.238 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:34.239 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap76a6eb6c-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.241 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:34 np0005474864 kernel: tap76a6eb6c-a0: entered promiscuous mode
Oct  7 16:16:34 np0005474864 NetworkManager[51631]: <info>  [1759868194.2432] manager: (tap76a6eb6c-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:34.245 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap76a6eb6c-a0, col_values=(('external_ids', {'iface-id': 'bdb86fc3-e81b-4f35-8229-6aeee90a663b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:34 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:34Z|00159|binding|INFO|Releasing lport bdb86fc3-e81b-4f35-8229-6aeee90a663b from this chassis (sb_readonly=0)
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:34.248 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/76a6eb6c-a532-47d0-908b-b56f18cd0dee.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/76a6eb6c-a532-47d0-908b-b56f18cd0dee.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:34.249 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[56811d72-84bb-4d67-84c7-10cd3668e59e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:34.250 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-76a6eb6c-a532-47d0-908b-b56f18cd0dee
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/76a6eb6c-a532-47d0-908b-b56f18cd0dee.pid.haproxy
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID 76a6eb6c-a532-47d0-908b-b56f18cd0dee
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:16:34 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:34.251 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee', 'env', 'PROCESS_TAG=haproxy-76a6eb6c-a532-47d0-908b-b56f18cd0dee', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/76a6eb6c-a532-47d0-908b-b56f18cd0dee.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.418 2 DEBUG nova.network.neutron [req-966abc68-c37f-4972-a211-d47abdd03318 req-43f30327-707a-41e4-9eec-4f7dfe5cce35 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Updated VIF entry in instance network info cache for port 44a32614-55d7-4ef1-a5fd-a40fcb2f1932. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.419 2 DEBUG nova.network.neutron [req-966abc68-c37f-4972-a211-d47abdd03318 req-43f30327-707a-41e4-9eec-4f7dfe5cce35 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Updating instance_info_cache with network_info: [{"id": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "address": "fa:16:3e:7f:60:4d", "network": {"id": "76a6eb6c-a532-47d0-908b-b56f18cd0dee", "bridge": "br-int", "label": "tempest-network-smoke--293453099", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44a32614-55", "ovs_interfaceid": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.441 2 DEBUG oslo_concurrency.lockutils [req-966abc68-c37f-4972-a211-d47abdd03318 req-43f30327-707a-41e4-9eec-4f7dfe5cce35 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:16:34 np0005474864 podman[225013]: 2025-10-07 20:16:34.632629746 +0000 UTC m=+0.059996558 container create f0c452add8436db8eac6d8d344b9c60ff7cfa257797a8fa137cca5f748f6afc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2)
Oct  7 16:16:34 np0005474864 systemd[1]: Started libpod-conmon-f0c452add8436db8eac6d8d344b9c60ff7cfa257797a8fa137cca5f748f6afc3.scope.
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.670 2 DEBUG nova.compute.manager [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.672 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868194.6698208, 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.672 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] VM Started (Lifecycle Event)#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.675 2 DEBUG nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.678 2 INFO nova.virt.libvirt.driver [-] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Instance spawned successfully.#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.679 2 DEBUG nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  7 16:16:34 np0005474864 podman[225013]: 2025-10-07 20:16:34.602989318 +0000 UTC m=+0.030356170 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.705 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.710 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.713 2 DEBUG nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.714 2 DEBUG nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.714 2 DEBUG nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.714 2 DEBUG nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.715 2 DEBUG nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.715 2 DEBUG nova.virt.libvirt.driver [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:16:34 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:16:34 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/19c15a9cb6d42a16981b628139eb1f5a2f4d3aab86c2965d38de70620f15eb38/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:16:34 np0005474864 podman[225013]: 2025-10-07 20:16:34.738731862 +0000 UTC m=+0.166098684 container init f0c452add8436db8eac6d8d344b9c60ff7cfa257797a8fa137cca5f748f6afc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.748 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.748 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868194.6743913, 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.749 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:16:34 np0005474864 podman[225013]: 2025-10-07 20:16:34.751100816 +0000 UTC m=+0.178467628 container start f0c452add8436db8eac6d8d344b9c60ff7cfa257797a8fa137cca5f748f6afc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.778 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:16:34 np0005474864 neutron-haproxy-ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee[225028]: [NOTICE]   (225032) : New worker (225034) forked
Oct  7 16:16:34 np0005474864 neutron-haproxy-ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee[225028]: [NOTICE]   (225032) : Loading success.
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.783 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868194.674965, 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.783 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.796 2 INFO nova.compute.manager [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Took 7.69 seconds to spawn the instance on the hypervisor.#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.796 2 DEBUG nova.compute.manager [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.808 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.811 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.829 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.860 2 INFO nova.compute.manager [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Took 8.33 seconds to build instance.#033[00m
Oct  7 16:16:34 np0005474864 nova_compute[192593]: 2025-10-07 20:16:34.876 2 DEBUG oslo_concurrency.lockutils [None req-56d2a8dd-754f-431d-aed7-50d5551fcb05 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.434s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:36 np0005474864 nova_compute[192593]: 2025-10-07 20:16:36.131 2 DEBUG nova.compute.manager [req-aa405643-bc4c-493e-8754-c48c171e8ea4 req-07af12ac-f733-459b-a8d4-5108b03045f7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Received event network-vif-plugged-44a32614-55d7-4ef1-a5fd-a40fcb2f1932 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:16:36 np0005474864 nova_compute[192593]: 2025-10-07 20:16:36.132 2 DEBUG oslo_concurrency.lockutils [req-aa405643-bc4c-493e-8754-c48c171e8ea4 req-07af12ac-f733-459b-a8d4-5108b03045f7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:36 np0005474864 nova_compute[192593]: 2025-10-07 20:16:36.132 2 DEBUG oslo_concurrency.lockutils [req-aa405643-bc4c-493e-8754-c48c171e8ea4 req-07af12ac-f733-459b-a8d4-5108b03045f7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:36 np0005474864 nova_compute[192593]: 2025-10-07 20:16:36.132 2 DEBUG oslo_concurrency.lockutils [req-aa405643-bc4c-493e-8754-c48c171e8ea4 req-07af12ac-f733-459b-a8d4-5108b03045f7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:36 np0005474864 nova_compute[192593]: 2025-10-07 20:16:36.133 2 DEBUG nova.compute.manager [req-aa405643-bc4c-493e-8754-c48c171e8ea4 req-07af12ac-f733-459b-a8d4-5108b03045f7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] No waiting events found dispatching network-vif-plugged-44a32614-55d7-4ef1-a5fd-a40fcb2f1932 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:16:36 np0005474864 nova_compute[192593]: 2025-10-07 20:16:36.133 2 WARNING nova.compute.manager [req-aa405643-bc4c-493e-8754-c48c171e8ea4 req-07af12ac-f733-459b-a8d4-5108b03045f7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Received unexpected event network-vif-plugged-44a32614-55d7-4ef1-a5fd-a40fcb2f1932 for instance with vm_state active and task_state None.#033[00m
Oct  7 16:16:37 np0005474864 nova_compute[192593]: 2025-10-07 20:16:37.338 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:38 np0005474864 podman[225043]: 2025-10-07 20:16:38.399721495 +0000 UTC m=+0.084736086 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  7 16:16:38 np0005474864 podman[225045]: 2025-10-07 20:16:38.43903308 +0000 UTC m=+0.098674605 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  7 16:16:38 np0005474864 podman[225044]: 2025-10-07 20:16:38.490607305 +0000 UTC m=+0.153778151 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:16:38 np0005474864 nova_compute[192593]: 2025-10-07 20:16:38.502 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:39 np0005474864 nova_compute[192593]: 2025-10-07 20:16:39.271 2 DEBUG nova.compute.manager [req-87bd9943-056c-46de-be22-4ec5eb053b1d req-048e3718-d6d4-4e43-8afa-3570278903b0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Received event network-changed-44a32614-55d7-4ef1-a5fd-a40fcb2f1932 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:16:39 np0005474864 nova_compute[192593]: 2025-10-07 20:16:39.272 2 DEBUG nova.compute.manager [req-87bd9943-056c-46de-be22-4ec5eb053b1d req-048e3718-d6d4-4e43-8afa-3570278903b0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Refreshing instance network info cache due to event network-changed-44a32614-55d7-4ef1-a5fd-a40fcb2f1932. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:16:39 np0005474864 nova_compute[192593]: 2025-10-07 20:16:39.272 2 DEBUG oslo_concurrency.lockutils [req-87bd9943-056c-46de-be22-4ec5eb053b1d req-048e3718-d6d4-4e43-8afa-3570278903b0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:16:39 np0005474864 nova_compute[192593]: 2025-10-07 20:16:39.272 2 DEBUG oslo_concurrency.lockutils [req-87bd9943-056c-46de-be22-4ec5eb053b1d req-048e3718-d6d4-4e43-8afa-3570278903b0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:16:39 np0005474864 nova_compute[192593]: 2025-10-07 20:16:39.272 2 DEBUG nova.network.neutron [req-87bd9943-056c-46de-be22-4ec5eb053b1d req-048e3718-d6d4-4e43-8afa-3570278903b0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Refreshing network info cache for port 44a32614-55d7-4ef1-a5fd-a40fcb2f1932 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:16:40 np0005474864 nova_compute[192593]: 2025-10-07 20:16:40.493 2 DEBUG nova.network.neutron [req-87bd9943-056c-46de-be22-4ec5eb053b1d req-048e3718-d6d4-4e43-8afa-3570278903b0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Updated VIF entry in instance network info cache for port 44a32614-55d7-4ef1-a5fd-a40fcb2f1932. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:16:40 np0005474864 nova_compute[192593]: 2025-10-07 20:16:40.495 2 DEBUG nova.network.neutron [req-87bd9943-056c-46de-be22-4ec5eb053b1d req-048e3718-d6d4-4e43-8afa-3570278903b0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Updating instance_info_cache with network_info: [{"id": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "address": "fa:16:3e:7f:60:4d", "network": {"id": "76a6eb6c-a532-47d0-908b-b56f18cd0dee", "bridge": "br-int", "label": "tempest-network-smoke--293453099", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44a32614-55", "ovs_interfaceid": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:16:40 np0005474864 nova_compute[192593]: 2025-10-07 20:16:40.520 2 DEBUG oslo_concurrency.lockutils [req-87bd9943-056c-46de-be22-4ec5eb053b1d req-048e3718-d6d4-4e43-8afa-3570278903b0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:16:40 np0005474864 nova_compute[192593]: 2025-10-07 20:16:40.986 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:41 np0005474864 nova_compute[192593]: 2025-10-07 20:16:41.282 2 DEBUG nova.compute.manager [req-630f028c-21dd-42b1-a264-e94622f476f0 req-07bcb278-5005-427d-a38c-376704214ef0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Received event network-changed-44a32614-55d7-4ef1-a5fd-a40fcb2f1932 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:16:41 np0005474864 nova_compute[192593]: 2025-10-07 20:16:41.283 2 DEBUG nova.compute.manager [req-630f028c-21dd-42b1-a264-e94622f476f0 req-07bcb278-5005-427d-a38c-376704214ef0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Refreshing instance network info cache due to event network-changed-44a32614-55d7-4ef1-a5fd-a40fcb2f1932. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:16:41 np0005474864 nova_compute[192593]: 2025-10-07 20:16:41.283 2 DEBUG oslo_concurrency.lockutils [req-630f028c-21dd-42b1-a264-e94622f476f0 req-07bcb278-5005-427d-a38c-376704214ef0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:16:41 np0005474864 nova_compute[192593]: 2025-10-07 20:16:41.284 2 DEBUG oslo_concurrency.lockutils [req-630f028c-21dd-42b1-a264-e94622f476f0 req-07bcb278-5005-427d-a38c-376704214ef0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:16:41 np0005474864 nova_compute[192593]: 2025-10-07 20:16:41.284 2 DEBUG nova.network.neutron [req-630f028c-21dd-42b1-a264-e94622f476f0 req-07bcb278-5005-427d-a38c-376704214ef0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Refreshing network info cache for port 44a32614-55d7-4ef1-a5fd-a40fcb2f1932 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:16:42 np0005474864 nova_compute[192593]: 2025-10-07 20:16:42.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:42 np0005474864 nova_compute[192593]: 2025-10-07 20:16:42.829 2 DEBUG nova.network.neutron [req-630f028c-21dd-42b1-a264-e94622f476f0 req-07bcb278-5005-427d-a38c-376704214ef0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Updated VIF entry in instance network info cache for port 44a32614-55d7-4ef1-a5fd-a40fcb2f1932. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:16:42 np0005474864 nova_compute[192593]: 2025-10-07 20:16:42.830 2 DEBUG nova.network.neutron [req-630f028c-21dd-42b1-a264-e94622f476f0 req-07bcb278-5005-427d-a38c-376704214ef0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Updating instance_info_cache with network_info: [{"id": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "address": "fa:16:3e:7f:60:4d", "network": {"id": "76a6eb6c-a532-47d0-908b-b56f18cd0dee", "bridge": "br-int", "label": "tempest-network-smoke--293453099", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44a32614-55", "ovs_interfaceid": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:16:42 np0005474864 nova_compute[192593]: 2025-10-07 20:16:42.845 2 DEBUG oslo_concurrency.lockutils [req-630f028c-21dd-42b1-a264-e94622f476f0 req-07bcb278-5005-427d-a38c-376704214ef0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:16:43 np0005474864 nova_compute[192593]: 2025-10-07 20:16:43.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:44 np0005474864 podman[225103]: 2025-10-07 20:16:44.370093977 +0000 UTC m=+0.062268513 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  7 16:16:45 np0005474864 nova_compute[192593]: 2025-10-07 20:16:45.672 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:46 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:46Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:7f:60:4d 10.100.0.14
Oct  7 16:16:46 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:46Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:7f:60:4d 10.100.0.14
Oct  7 16:16:47 np0005474864 nova_compute[192593]: 2025-10-07 20:16:47.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:47 np0005474864 podman[225133]: 2025-10-07 20:16:47.406521608 +0000 UTC m=+0.084173899 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  7 16:16:48 np0005474864 nova_compute[192593]: 2025-10-07 20:16:48.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:50 np0005474864 podman[225158]: 2025-10-07 20:16:50.406906089 +0000 UTC m=+0.099069665 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.458 2 DEBUG oslo_concurrency.lockutils [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.459 2 DEBUG oslo_concurrency.lockutils [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.460 2 DEBUG oslo_concurrency.lockutils [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.460 2 DEBUG oslo_concurrency.lockutils [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.461 2 DEBUG oslo_concurrency.lockutils [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.463 2 INFO nova.compute.manager [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Terminating instance#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.465 2 DEBUG nova.compute.manager [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  7 16:16:52 np0005474864 kernel: tap44a32614-55 (unregistering): left promiscuous mode
Oct  7 16:16:52 np0005474864 NetworkManager[51631]: <info>  [1759868212.4946] device (tap44a32614-55): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:16:52 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:52Z|00160|binding|INFO|Releasing lport 44a32614-55d7-4ef1-a5fd-a40fcb2f1932 from this chassis (sb_readonly=0)
Oct  7 16:16:52 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:52Z|00161|binding|INFO|Setting lport 44a32614-55d7-4ef1-a5fd-a40fcb2f1932 down in Southbound
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.549 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:52 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:52Z|00162|binding|INFO|Removing iface tap44a32614-55 ovn-installed in OVS
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.551 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:52.558 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:7f:60:4d 10.100.0.14', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': '01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-76a6eb6c-a532-47d0-908b-b56f18cd0dee', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b69ac5dc2b44912af0aa0671c7e3696', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5a35b5ca-173b-49c1-b4c9-9509d16f2f82, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=44a32614-55d7-4ef1-a5fd-a40fcb2f1932) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:16:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:52.561 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 44a32614-55d7-4ef1-a5fd-a40fcb2f1932 in datapath 76a6eb6c-a532-47d0-908b-b56f18cd0dee unbound from our chassis#033[00m
Oct  7 16:16:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:52.563 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 76a6eb6c-a532-47d0-908b-b56f18cd0dee, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:16:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:52.580 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[a5ec1e86-fd1a-411d-a997-4dbb8a0dfe21]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:52.583 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee namespace which is not needed anymore#033[00m
Oct  7 16:16:52 np0005474864 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000001e.scope: Deactivated successfully.
Oct  7 16:16:52 np0005474864 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d0000001e.scope: Consumed 12.445s CPU time.
Oct  7 16:16:52 np0005474864 systemd-machined[152586]: Machine qemu-10-instance-0000001e terminated.
Oct  7 16:16:52 np0005474864 neutron-haproxy-ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee[225028]: [NOTICE]   (225032) : haproxy version is 2.8.14-c23fe91
Oct  7 16:16:52 np0005474864 neutron-haproxy-ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee[225028]: [NOTICE]   (225032) : path to executable is /usr/sbin/haproxy
Oct  7 16:16:52 np0005474864 neutron-haproxy-ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee[225028]: [WARNING]  (225032) : Exiting Master process...
Oct  7 16:16:52 np0005474864 neutron-haproxy-ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee[225028]: [ALERT]    (225032) : Current worker (225034) exited with code 143 (Terminated)
Oct  7 16:16:52 np0005474864 neutron-haproxy-ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee[225028]: [WARNING]  (225032) : All workers exited. Exiting... (0)
Oct  7 16:16:52 np0005474864 systemd[1]: libpod-f0c452add8436db8eac6d8d344b9c60ff7cfa257797a8fa137cca5f748f6afc3.scope: Deactivated successfully.
Oct  7 16:16:52 np0005474864 podman[225202]: 2025-10-07 20:16:52.765617941 +0000 UTC m=+0.060168423 container died f0c452add8436db8eac6d8d344b9c60ff7cfa257797a8fa137cca5f748f6afc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.771 2 INFO nova.virt.libvirt.driver [-] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Instance destroyed successfully.#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.772 2 DEBUG nova.objects.instance [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lazy-loading 'resources' on Instance uuid 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.785 2 DEBUG nova.virt.libvirt.vif [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:16:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1464343180-gen-1-1810787220',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1464343180-gen-1-1810787220',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1464343180-ge',id=30,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLvgCclNCSnL6bAl88RaoE/mDuz9vzVZIyOQ9372aCj0lSBodfF+wIreifgZR5TdbY8cqOUBnLcIsW2x52Tz52LQw3S/GAc8nHVmoD/mlmP4GsIo7dPUAD2amfmN9ntUXQ==',key_name='tempest-TestSecurityGroupsBasicOps-1421588277',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:16:34Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4b69ac5dc2b44912af0aa0671c7e3696',ramdisk_id='',reservation_id='r-om90mukt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1464343180',owner_user_name='tempest-TestSecurityGroupsBasicOps-1464343180-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:16:34Z,user_data=None,user_id='ab16a639b2af44c7bc4218a1b1b91068',uuid=01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "address": "fa:16:3e:7f:60:4d", "network": {"id": "76a6eb6c-a532-47d0-908b-b56f18cd0dee", "bridge": "br-int", "label": "tempest-network-smoke--293453099", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44a32614-55", "ovs_interfaceid": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.786 2 DEBUG nova.network.os_vif_util [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Converting VIF {"id": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "address": "fa:16:3e:7f:60:4d", "network": {"id": "76a6eb6c-a532-47d0-908b-b56f18cd0dee", "bridge": "br-int", "label": "tempest-network-smoke--293453099", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap44a32614-55", "ovs_interfaceid": "44a32614-55d7-4ef1-a5fd-a40fcb2f1932", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.787 2 DEBUG nova.network.os_vif_util [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:7f:60:4d,bridge_name='br-int',has_traffic_filtering=True,id=44a32614-55d7-4ef1-a5fd-a40fcb2f1932,network=Network(76a6eb6c-a532-47d0-908b-b56f18cd0dee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44a32614-55') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.787 2 DEBUG os_vif [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:60:4d,bridge_name='br-int',has_traffic_filtering=True,id=44a32614-55d7-4ef1-a5fd-a40fcb2f1932,network=Network(76a6eb6c-a532-47d0-908b-b56f18cd0dee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44a32614-55') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.790 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap44a32614-55, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.793 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.795 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.799 2 INFO os_vif [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:7f:60:4d,bridge_name='br-int',has_traffic_filtering=True,id=44a32614-55d7-4ef1-a5fd-a40fcb2f1932,network=Network(76a6eb6c-a532-47d0-908b-b56f18cd0dee),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap44a32614-55')#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.799 2 INFO nova.virt.libvirt.driver [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Deleting instance files /var/lib/nova/instances/01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc_del#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.800 2 INFO nova.virt.libvirt.driver [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Deletion of /var/lib/nova/instances/01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc_del complete#033[00m
Oct  7 16:16:52 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f0c452add8436db8eac6d8d344b9c60ff7cfa257797a8fa137cca5f748f6afc3-userdata-shm.mount: Deactivated successfully.
Oct  7 16:16:52 np0005474864 systemd[1]: var-lib-containers-storage-overlay-19c15a9cb6d42a16981b628139eb1f5a2f4d3aab86c2965d38de70620f15eb38-merged.mount: Deactivated successfully.
Oct  7 16:16:52 np0005474864 podman[225202]: 2025-10-07 20:16:52.82779879 +0000 UTC m=+0.122349272 container cleanup f0c452add8436db8eac6d8d344b9c60ff7cfa257797a8fa137cca5f748f6afc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.849 2 INFO nova.compute.manager [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  7 16:16:52 np0005474864 systemd[1]: libpod-conmon-f0c452add8436db8eac6d8d344b9c60ff7cfa257797a8fa137cca5f748f6afc3.scope: Deactivated successfully.
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.850 2 DEBUG oslo.service.loopingcall [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.850 2 DEBUG nova.compute.manager [-] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.850 2 DEBUG nova.network.neutron [-] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  7 16:16:52 np0005474864 podman[225250]: 2025-10-07 20:16:52.911180686 +0000 UTC m=+0.056583620 container remove f0c452add8436db8eac6d8d344b9c60ff7cfa257797a8fa137cca5f748f6afc3 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  7 16:16:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:52.920 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[0780f4e0-256a-43ce-a62d-fe8b300ddd02]: (4, ('Tue Oct  7 08:16:52 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee (f0c452add8436db8eac6d8d344b9c60ff7cfa257797a8fa137cca5f748f6afc3)\nf0c452add8436db8eac6d8d344b9c60ff7cfa257797a8fa137cca5f748f6afc3\nTue Oct  7 08:16:52 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee (f0c452add8436db8eac6d8d344b9c60ff7cfa257797a8fa137cca5f748f6afc3)\nf0c452add8436db8eac6d8d344b9c60ff7cfa257797a8fa137cca5f748f6afc3\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:52.923 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ab396aca-3bf1-4605-b37e-f503af656c92]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:52.925 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap76a6eb6c-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:52 np0005474864 kernel: tap76a6eb6c-a0: left promiscuous mode
Oct  7 16:16:52 np0005474864 nova_compute[192593]: 2025-10-07 20:16:52.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:52.957 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[a7920e45-cdde-45ab-acd8-5e93552d93a8]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:52.996 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c39f8a10-72fc-4f4a-9d72-fb53604c8ce0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:53 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:52.999 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[27ebd21f-c80f-4ffb-94b8-d207b955c548]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:53 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:53.018 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[0f79e060-ff6a-43b3-b861-072cfb5878ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 382983, 'reachable_time': 25020, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225265, 'error': None, 'target': 'ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:53 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:53.022 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-76a6eb6c-a532-47d0-908b-b56f18cd0dee deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:16:53 np0005474864 systemd[1]: run-netns-ovnmeta\x2d76a6eb6c\x2da532\x2d47d0\x2d908b\x2db56f18cd0dee.mount: Deactivated successfully.
Oct  7 16:16:53 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:53.022 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[26eb5df2-9d7e-4740-b6b3-86549d24e84e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:53 np0005474864 nova_compute[192593]: 2025-10-07 20:16:53.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:53 np0005474864 nova_compute[192593]: 2025-10-07 20:16:53.618 2 DEBUG nova.compute.manager [req-38746245-6e8c-49ac-8a65-07099941714c req-4b877093-d208-4860-8fec-5288b7df1656 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Received event network-vif-unplugged-44a32614-55d7-4ef1-a5fd-a40fcb2f1932 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:16:53 np0005474864 nova_compute[192593]: 2025-10-07 20:16:53.619 2 DEBUG oslo_concurrency.lockutils [req-38746245-6e8c-49ac-8a65-07099941714c req-4b877093-d208-4860-8fec-5288b7df1656 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:53 np0005474864 nova_compute[192593]: 2025-10-07 20:16:53.619 2 DEBUG oslo_concurrency.lockutils [req-38746245-6e8c-49ac-8a65-07099941714c req-4b877093-d208-4860-8fec-5288b7df1656 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:53 np0005474864 nova_compute[192593]: 2025-10-07 20:16:53.619 2 DEBUG oslo_concurrency.lockutils [req-38746245-6e8c-49ac-8a65-07099941714c req-4b877093-d208-4860-8fec-5288b7df1656 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:53 np0005474864 nova_compute[192593]: 2025-10-07 20:16:53.620 2 DEBUG nova.compute.manager [req-38746245-6e8c-49ac-8a65-07099941714c req-4b877093-d208-4860-8fec-5288b7df1656 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] No waiting events found dispatching network-vif-unplugged-44a32614-55d7-4ef1-a5fd-a40fcb2f1932 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:16:53 np0005474864 nova_compute[192593]: 2025-10-07 20:16:53.620 2 DEBUG nova.compute.manager [req-38746245-6e8c-49ac-8a65-07099941714c req-4b877093-d208-4860-8fec-5288b7df1656 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Received event network-vif-unplugged-44a32614-55d7-4ef1-a5fd-a40fcb2f1932 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  7 16:16:53 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:53.767 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:76:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'ba:bb:91:e8:2b:5d'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:16:53 np0005474864 nova_compute[192593]: 2025-10-07 20:16:53.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:53 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:53.772 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  7 16:16:53 np0005474864 nova_compute[192593]: 2025-10-07 20:16:53.859 2 DEBUG nova.network.neutron [-] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:16:53 np0005474864 nova_compute[192593]: 2025-10-07 20:16:53.876 2 INFO nova.compute.manager [-] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Took 1.03 seconds to deallocate network for instance.#033[00m
Oct  7 16:16:53 np0005474864 nova_compute[192593]: 2025-10-07 20:16:53.925 2 DEBUG oslo_concurrency.lockutils [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:53 np0005474864 nova_compute[192593]: 2025-10-07 20:16:53.926 2 DEBUG oslo_concurrency.lockutils [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:53 np0005474864 nova_compute[192593]: 2025-10-07 20:16:53.931 2 DEBUG nova.compute.manager [req-76c7ec45-2f4a-45cb-8644-17444891f025 req-4c2f01f2-a335-465d-81bf-57d2a75356b0 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Received event network-vif-deleted-44a32614-55d7-4ef1-a5fd-a40fcb2f1932 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:16:54 np0005474864 nova_compute[192593]: 2025-10-07 20:16:54.030 2 DEBUG nova.compute.provider_tree [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:16:54 np0005474864 nova_compute[192593]: 2025-10-07 20:16:54.044 2 DEBUG nova.scheduler.client.report [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:16:54 np0005474864 nova_compute[192593]: 2025-10-07 20:16:54.064 2 DEBUG oslo_concurrency.lockutils [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:54 np0005474864 nova_compute[192593]: 2025-10-07 20:16:54.086 2 INFO nova.scheduler.client.report [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Deleted allocations for instance 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc#033[00m
Oct  7 16:16:54 np0005474864 nova_compute[192593]: 2025-10-07 20:16:54.142 2 DEBUG oslo_concurrency.lockutils [None req-b011a24b-0498-47af-86a3-92389f49e779 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.683s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:55 np0005474864 nova_compute[192593]: 2025-10-07 20:16:55.697 2 DEBUG nova.compute.manager [req-e5617870-bbd9-4ed7-ad14-b612fe2ea0be req-d1511136-093a-4697-9aa7-52316c03be54 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Received event network-vif-plugged-44a32614-55d7-4ef1-a5fd-a40fcb2f1932 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:16:55 np0005474864 nova_compute[192593]: 2025-10-07 20:16:55.698 2 DEBUG oslo_concurrency.lockutils [req-e5617870-bbd9-4ed7-ad14-b612fe2ea0be req-d1511136-093a-4697-9aa7-52316c03be54 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:55 np0005474864 nova_compute[192593]: 2025-10-07 20:16:55.699 2 DEBUG oslo_concurrency.lockutils [req-e5617870-bbd9-4ed7-ad14-b612fe2ea0be req-d1511136-093a-4697-9aa7-52316c03be54 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:55 np0005474864 nova_compute[192593]: 2025-10-07 20:16:55.699 2 DEBUG oslo_concurrency.lockutils [req-e5617870-bbd9-4ed7-ad14-b612fe2ea0be req-d1511136-093a-4697-9aa7-52316c03be54 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:55 np0005474864 nova_compute[192593]: 2025-10-07 20:16:55.699 2 DEBUG nova.compute.manager [req-e5617870-bbd9-4ed7-ad14-b612fe2ea0be req-d1511136-093a-4697-9aa7-52316c03be54 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] No waiting events found dispatching network-vif-plugged-44a32614-55d7-4ef1-a5fd-a40fcb2f1932 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:16:55 np0005474864 nova_compute[192593]: 2025-10-07 20:16:55.700 2 WARNING nova.compute.manager [req-e5617870-bbd9-4ed7-ad14-b612fe2ea0be req-d1511136-093a-4697-9aa7-52316c03be54 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Received unexpected event network-vif-plugged-44a32614-55d7-4ef1-a5fd-a40fcb2f1932 for instance with vm_state deleted and task_state None.#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.443 2 DEBUG oslo_concurrency.lockutils [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.443 2 DEBUG oslo_concurrency.lockutils [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.444 2 DEBUG oslo_concurrency.lockutils [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.444 2 DEBUG oslo_concurrency.lockutils [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.444 2 DEBUG oslo_concurrency.lockutils [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.446 2 INFO nova.compute.manager [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Terminating instance#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.447 2 DEBUG nova.compute.manager [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  7 16:16:57 np0005474864 kernel: tapb88a1e01-9e (unregistering): left promiscuous mode
Oct  7 16:16:57 np0005474864 NetworkManager[51631]: <info>  [1759868217.4707] device (tapb88a1e01-9e): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.489 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:57 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:57Z|00163|binding|INFO|Releasing lport b88a1e01-9e7f-49da-9f11-434972486fb7 from this chassis (sb_readonly=0)
Oct  7 16:16:57 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:57Z|00164|binding|INFO|Setting lport b88a1e01-9e7f-49da-9f11-434972486fb7 down in Southbound
Oct  7 16:16:57 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:57Z|00165|binding|INFO|Removing iface tapb88a1e01-9e ovn-installed in OVS
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.495 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.498 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a0:2b:b7 10.100.0.9'], port_security=['fa:16:3e:a0:2b:b7 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6a1d6d4-586d-450e-8b73-6ad134098649', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a6b53ec8-0088-49b2-96e7-c4770f1b7fbc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34f1dcb0-f04e-41a8-8b02-05684b457dc5, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=b88a1e01-9e7f-49da-9f11-434972486fb7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.499 103685 INFO neutron.agent.ovn.metadata.agent [-] Port b88a1e01-9e7f-49da-9f11-434972486fb7 in datapath d6a1d6d4-586d-450e-8b73-6ad134098649 unbound from our chassis#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.502 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d6a1d6d4-586d-450e-8b73-6ad134098649#033[00m
Oct  7 16:16:57 np0005474864 kernel: tap9315ca92-32 (unregistering): left promiscuous mode
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.522 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:57 np0005474864 NetworkManager[51631]: <info>  [1759868217.5361] device (tap9315ca92-32): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.536 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[6da03ec9-fc35-4780-be6e-b20ec8708308]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.556 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:57 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:57Z|00166|binding|INFO|Releasing lport 9315ca92-32fd-4407-a73c-c7c9440c29b8 from this chassis (sb_readonly=0)
Oct  7 16:16:57 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:57Z|00167|binding|INFO|Setting lport 9315ca92-32fd-4407-a73c-c7c9440c29b8 down in Southbound
Oct  7 16:16:57 np0005474864 ovn_controller[94801]: 2025-10-07T20:16:57Z|00168|binding|INFO|Removing iface tap9315ca92-32 ovn-installed in OVS
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.570 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:72:12:14 2001:db8:0:1:f816:3eff:fe72:1214 2001:db8::f816:3eff:fe72:1214'], port_security=['fa:16:3e:72:12:14 2001:db8:0:1:f816:3eff:fe72:1214 2001:db8::f816:3eff:fe72:1214'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe72:1214/64 2001:db8::f816:3eff:fe72:1214/64', 'neutron:device_id': 'd37e7dbd-01fd-4484-9bdb-f09d24420fa7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-99465e0c-6ee8-477a-94aa-ab737f76f9e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a6b53ec8-0088-49b2-96e7-c4770f1b7fbc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7fb1d4ce-b691-4091-872a-86df16b02e47, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=9315ca92-32fd-4407-a73c-c7c9440c29b8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.593 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[56475449-3f61-4b08-8483-86c020f14bad]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.597 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[5b9d08dd-4f8f-4e1b-b1df-6f7e0f663152]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:57 np0005474864 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000001c.scope: Deactivated successfully.
Oct  7 16:16:57 np0005474864 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d0000001c.scope: Consumed 14.821s CPU time.
Oct  7 16:16:57 np0005474864 systemd-machined[152586]: Machine qemu-9-instance-0000001c terminated.
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.633 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[b70d86fe-62d8-41aa-8d3c-3ee7d2291a87]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.662 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[854942a7-9089-4ec2-a343-0f4dbdbb9f4b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd6a1d6d4-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:c9:72'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 12, 'tx_packets': 8, 'rx_bytes': 1000, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 12, 'tx_packets': 8, 'rx_bytes': 1000, 'tx_bytes': 528, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 41], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376470, 'reachable_time': 36522, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 8, 'inoctets': 720, 'indelivers': 1, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 304, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 8, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 720, 'outmcastoctets': 304, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 8, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 1, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225282, 'error': None, 'target': 'ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:57 np0005474864 NetworkManager[51631]: <info>  [1759868217.6819] manager: (tap9315ca92-32): new Tun device (/org/freedesktop/NetworkManager/Devices/87)
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.692 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[3b76f49e-229f-49b8-9798-d0e5ec8c4e7d]: (4, ({'family': 2, 'prefixlen': 28, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '10.100.0.2'], ['IFA_LOCAL', '10.100.0.2'], ['IFA_BROADCAST', '10.100.0.15'], ['IFA_LABEL', 'tapd6a1d6d4-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376488, 'tstamp': 376488}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225285, 'error': None, 'target': 'ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'}, {'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tapd6a1d6d4-51'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376493, 'tstamp': 376493}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225285, 'error': None, 'target': 'ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'})) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.694 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6a1d6d4-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.712 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.712 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd6a1d6d4-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.713 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.713 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd6a1d6d4-50, col_values=(('external_ids', {'iface-id': '5cf38e83-5f07-4562-b663-4850a1d35f81'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.714 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.717 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 9315ca92-32fd-4407-a73c-c7c9440c29b8 in datapath 99465e0c-6ee8-477a-94aa-ab737f76f9e4 unbound from our chassis#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.721 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 99465e0c-6ee8-477a-94aa-ab737f76f9e4#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.743 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[2550292d-a250-413d-8d37-c69036363097]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.744 2 INFO nova.virt.libvirt.driver [-] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Instance destroyed successfully.#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.745 2 DEBUG nova.objects.instance [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'resources' on Instance uuid d37e7dbd-01fd-4484-9bdb-f09d24420fa7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.764 2 DEBUG nova.virt.libvirt.vif [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:15:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1118891772',display_name='tempest-TestGettingAddress-server-1118891772',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1118891772',id=28,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMHh+5BPf+7kY8pKVSG/i4hyBxtn8tVeo7dRQWQLUq5e94pXRwlhKtKtONimxLXuRPnL+V+cWD3ZzAcmW1n0ScN01FDbsZi+mcaAgIQIdbWXoCLml/sEmzbizmftxgLGkQ==',key_name='tempest-TestGettingAddress-195199531',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:16:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-pk9arxda',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:16:18Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=d37e7dbd-01fd-4484-9bdb-f09d24420fa7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "b88a1e01-9e7f-49da-9f11-434972486fb7", "address": "fa:16:3e:a0:2b:b7", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb88a1e01-9e", "ovs_interfaceid": "b88a1e01-9e7f-49da-9f11-434972486fb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.765 2 DEBUG nova.network.os_vif_util [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "b88a1e01-9e7f-49da-9f11-434972486fb7", "address": "fa:16:3e:a0:2b:b7", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapb88a1e01-9e", "ovs_interfaceid": "b88a1e01-9e7f-49da-9f11-434972486fb7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.765 2 DEBUG nova.network.os_vif_util [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:a0:2b:b7,bridge_name='br-int',has_traffic_filtering=True,id=b88a1e01-9e7f-49da-9f11-434972486fb7,network=Network(d6a1d6d4-586d-450e-8b73-6ad134098649),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb88a1e01-9e') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.766 2 DEBUG os_vif [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:2b:b7,bridge_name='br-int',has_traffic_filtering=True,id=b88a1e01-9e7f-49da-9f11-434972486fb7,network=Network(d6a1d6d4-586d-450e-8b73-6ad134098649),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb88a1e01-9e') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.767 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.767 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb88a1e01-9e, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.769 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.771 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.774 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.775 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.777 2 INFO os_vif [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:a0:2b:b7,bridge_name='br-int',has_traffic_filtering=True,id=b88a1e01-9e7f-49da-9f11-434972486fb7,network=Network(d6a1d6d4-586d-450e-8b73-6ad134098649),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapb88a1e01-9e')#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.778 2 DEBUG nova.virt.libvirt.vif [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:15:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1118891772',display_name='tempest-TestGettingAddress-server-1118891772',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1118891772',id=28,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMHh+5BPf+7kY8pKVSG/i4hyBxtn8tVeo7dRQWQLUq5e94pXRwlhKtKtONimxLXuRPnL+V+cWD3ZzAcmW1n0ScN01FDbsZi+mcaAgIQIdbWXoCLml/sEmzbizmftxgLGkQ==',key_name='tempest-TestGettingAddress-195199531',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:16:17Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-pk9arxda',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:16:18Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=d37e7dbd-01fd-4484-9bdb-f09d24420fa7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "address": "fa:16:3e:72:12:14", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe72:1214", 
"type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9315ca92-32", "ovs_interfaceid": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.778 2 DEBUG nova.network.os_vif_util [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "address": "fa:16:3e:72:12:14", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9315ca92-32", "ovs_interfaceid": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.779 2 DEBUG nova.network.os_vif_util [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:72:12:14,bridge_name='br-int',has_traffic_filtering=True,id=9315ca92-32fd-4407-a73c-c7c9440c29b8,network=Network(99465e0c-6ee8-477a-94aa-ab737f76f9e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9315ca92-32') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.780 2 DEBUG os_vif [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:12:14,bridge_name='br-int',has_traffic_filtering=True,id=9315ca92-32fd-4407-a73c-c7c9440c29b8,network=Network(99465e0c-6ee8-477a-94aa-ab737f76f9e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9315ca92-32') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.779 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[8e19e626-e395-45d7-8e90-885d96557532]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.781 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9315ca92-32, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.785 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[435dc6f8-3199-43e4-b85a-4268b4c839c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.787 2 INFO os_vif [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:72:12:14,bridge_name='br-int',has_traffic_filtering=True,id=9315ca92-32fd-4407-a73c-c7c9440c29b8,network=Network(99465e0c-6ee8-477a-94aa-ab737f76f9e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9315ca92-32')#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.787 2 INFO nova.virt.libvirt.driver [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Deleting instance files /var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7_del#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.788 2 INFO nova.virt.libvirt.driver [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Deletion of /var/lib/nova/instances/d37e7dbd-01fd-4484-9bdb-f09d24420fa7_del complete#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.823 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[b5e0955d-9ecc-44c2-b41d-2df3017c3ac4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.845 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[014d113a-3010-4ce3-82d2-e0391145c812]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap99465e0c-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:58:17:08'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 38, 'tx_packets': 6, 'rx_bytes': 3460, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 38, 'tx_packets': 6, 'rx_bytes': 3460, 'tx_bytes': 440, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 
'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 42], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376569, 'reachable_time': 34382, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 38, 'inoctets': 2928, 'indelivers': 13, 'outforwdatagrams': 0, 'outpkts': 4, 'outoctets': 300, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 
'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 38, 'outmcastpkts': 4, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 2928, 'outmcastoctets': 300, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 38, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 13, 'inerrors': 0, 'outmsgs': 4, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225319, 'error': None, 'target': 'ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.848 2 INFO nova.compute.manager [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.848 2 DEBUG oslo.service.loopingcall [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.849 2 DEBUG nova.compute.manager [-] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.849 2 DEBUG nova.network.neutron [-] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.872 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[1a21945c-0557-4c43-ba24-68a1e599ff50]: (4, ({'family': 2, 'prefixlen': 32, 'flags': 128, 'scope': 0, 'index': 2, 'attrs': [['IFA_ADDRESS', '169.254.169.254'], ['IFA_LOCAL', '169.254.169.254'], ['IFA_BROADCAST', '169.254.169.254'], ['IFA_LABEL', 'tap99465e0c-61'], ['IFA_FLAGS', 128], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 376582, 'tstamp': 376582}]], 'header': {'length': 96, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225320, 'error': None, 'target': 'ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.875 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99465e0c-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:57 np0005474864 nova_compute[192593]: 2025-10-07 20:16:57.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.880 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap99465e0c-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.881 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.882 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap99465e0c-60, col_values=(('external_ids', {'iface-id': '3fba40e3-39d9-4871-b9b9-3e2e5088af4f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:16:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:16:57.883 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:16:58 np0005474864 nova_compute[192593]: 2025-10-07 20:16:58.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.360 2 DEBUG nova.compute.manager [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received event network-changed-b88a1e01-9e7f-49da-9f11-434972486fb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.360 2 DEBUG nova.compute.manager [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Refreshing instance network info cache due to event network-changed-b88a1e01-9e7f-49da-9f11-434972486fb7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.361 2 DEBUG oslo_concurrency.lockutils [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-d37e7dbd-01fd-4484-9bdb-f09d24420fa7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.361 2 DEBUG oslo_concurrency.lockutils [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-d37e7dbd-01fd-4484-9bdb-f09d24420fa7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.362 2 DEBUG nova.network.neutron [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Refreshing network info cache for port b88a1e01-9e7f-49da-9f11-434972486fb7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.619 2 INFO nova.network.neutron [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Port b88a1e01-9e7f-49da-9f11-434972486fb7 from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.620 2 DEBUG nova.network.neutron [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Updating instance_info_cache with network_info: [{"id": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "address": "fa:16:3e:72:12:14", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe72:1214", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9315ca92-32", "ovs_interfaceid": "9315ca92-32fd-4407-a73c-c7c9440c29b8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.661 2 DEBUG oslo_concurrency.lockutils [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-d37e7dbd-01fd-4484-9bdb-f09d24420fa7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.662 2 DEBUG nova.compute.manager [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received event network-vif-unplugged-b88a1e01-9e7f-49da-9f11-434972486fb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.663 2 DEBUG oslo_concurrency.lockutils [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.663 2 DEBUG oslo_concurrency.lockutils [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.664 2 DEBUG oslo_concurrency.lockutils [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.664 2 DEBUG nova.compute.manager [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] No waiting events found dispatching network-vif-unplugged-b88a1e01-9e7f-49da-9f11-434972486fb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.665 2 DEBUG nova.compute.manager [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received event network-vif-unplugged-b88a1e01-9e7f-49da-9f11-434972486fb7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.665 2 DEBUG nova.compute.manager [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received event network-vif-plugged-b88a1e01-9e7f-49da-9f11-434972486fb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.665 2 DEBUG oslo_concurrency.lockutils [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.666 2 DEBUG oslo_concurrency.lockutils [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.666 2 DEBUG oslo_concurrency.lockutils [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.667 2 DEBUG nova.compute.manager [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] No waiting events found dispatching network-vif-plugged-b88a1e01-9e7f-49da-9f11-434972486fb7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.667 2 WARNING nova.compute.manager [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received unexpected event network-vif-plugged-b88a1e01-9e7f-49da-9f11-434972486fb7 for instance with vm_state active and task_state deleting.
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.668 2 DEBUG nova.compute.manager [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received event network-vif-unplugged-9315ca92-32fd-4407-a73c-c7c9440c29b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.668 2 DEBUG oslo_concurrency.lockutils [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.668 2 DEBUG oslo_concurrency.lockutils [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.669 2 DEBUG oslo_concurrency.lockutils [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.669 2 DEBUG nova.compute.manager [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] No waiting events found dispatching network-vif-unplugged-9315ca92-32fd-4407-a73c-c7c9440c29b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.670 2 DEBUG nova.compute.manager [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received event network-vif-unplugged-9315ca92-32fd-4407-a73c-c7c9440c29b8 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.670 2 DEBUG nova.compute.manager [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received event network-vif-plugged-9315ca92-32fd-4407-a73c-c7c9440c29b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.671 2 DEBUG oslo_concurrency.lockutils [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.671 2 DEBUG oslo_concurrency.lockutils [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.671 2 DEBUG oslo_concurrency.lockutils [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.672 2 DEBUG nova.compute.manager [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] No waiting events found dispatching network-vif-plugged-9315ca92-32fd-4407-a73c-c7c9440c29b8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.672 2 WARNING nova.compute.manager [req-1cc7bee3-cd3c-4540-a7b2-374c751ed6e5 req-485b1e0b-790d-42aa-be51-905b8c5b06a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received unexpected event network-vif-plugged-9315ca92-32fd-4407-a73c-c7c9440c29b8 for instance with vm_state active and task_state deleting.
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.775 2 DEBUG nova.network.neutron [-] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.793 2 INFO nova.compute.manager [-] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Took 1.94 seconds to deallocate network for instance.
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.832 2 DEBUG oslo_concurrency.lockutils [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.833 2 DEBUG oslo_concurrency.lockutils [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.935 2 DEBUG nova.compute.manager [req-9a2f11dd-fae6-4788-a863-060da076b41e req-3fe28da8-f034-4fe8-829d-0c2beb4501d6 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received event network-vif-deleted-b88a1e01-9e7f-49da-9f11-434972486fb7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.935 2 DEBUG nova.compute.manager [req-9a2f11dd-fae6-4788-a863-060da076b41e req-3fe28da8-f034-4fe8-829d-0c2beb4501d6 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Received event network-vif-deleted-9315ca92-32fd-4407-a73c-c7c9440c29b8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.940 2 DEBUG nova.compute.provider_tree [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.955 2 DEBUG nova.scheduler.client.report [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  7 16:16:59 np0005474864 nova_compute[192593]: 2025-10-07 20:16:59.985 2 DEBUG oslo_concurrency.lockutils [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:17:00 np0005474864 nova_compute[192593]: 2025-10-07 20:17:00.035 2 INFO nova.scheduler.client.report [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Deleted allocations for instance d37e7dbd-01fd-4484-9bdb-f09d24420fa7
Oct  7 16:17:00 np0005474864 nova_compute[192593]: 2025-10-07 20:17:00.107 2 DEBUG oslo_concurrency.lockutils [None req-3c6c5f6e-279c-4a3e-a2a0-ace65408d208 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d37e7dbd-01fd-4484-9bdb-f09d24420fa7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.664s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:17:02 np0005474864 ovn_controller[94801]: 2025-10-07T20:17:02Z|00169|binding|INFO|Releasing lport 3fba40e3-39d9-4871-b9b9-3e2e5088af4f from this chassis (sb_readonly=0)
Oct  7 16:17:02 np0005474864 ovn_controller[94801]: 2025-10-07T20:17:02Z|00170|binding|INFO|Releasing lport 5cf38e83-5f07-4562-b663-4850a1d35f81 from this chassis (sb_readonly=0)
Oct  7 16:17:02 np0005474864 nova_compute[192593]: 2025-10-07 20:17:02.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:17:02 np0005474864 nova_compute[192593]: 2025-10-07 20:17:02.783 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:17:03 np0005474864 podman[225322]: 2025-10-07 20:17:03.414135469 +0000 UTC m=+0.099143497 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, config_id=edpm, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, maintainer=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, vendor=Red Hat, Inc.)
Oct  7 16:17:03 np0005474864 podman[225321]: 2025-10-07 20:17:03.417374092 +0000 UTC m=+0.105338255 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  7 16:17:03 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.520 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:17:03 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.542 2 DEBUG nova.compute.manager [req-d6e25ce3-5866-4649-ad5f-a5aecd2906be req-0064d7f3-4d1a-477a-b237-b319f459a1d9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received event network-changed-c1d00195-4d32-45ac-b745-1a913060f39d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  7 16:17:03 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.542 2 DEBUG nova.compute.manager [req-d6e25ce3-5866-4649-ad5f-a5aecd2906be req-0064d7f3-4d1a-477a-b237-b319f459a1d9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Refreshing instance network info cache due to event network-changed-c1d00195-4d32-45ac-b745-1a913060f39d. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  7 16:17:03 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.543 2 DEBUG oslo_concurrency.lockutils [req-d6e25ce3-5866-4649-ad5f-a5aecd2906be req-0064d7f3-4d1a-477a-b237-b319f459a1d9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-c491b943-fbbd-46e0-be8c-74a8c1378ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  7 16:17:03 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.543 2 DEBUG oslo_concurrency.lockutils [req-d6e25ce3-5866-4649-ad5f-a5aecd2906be req-0064d7f3-4d1a-477a-b237-b319f459a1d9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-c491b943-fbbd-46e0-be8c-74a8c1378ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  7 16:17:03 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.543 2 DEBUG nova.network.neutron [req-d6e25ce3-5866-4649-ad5f-a5aecd2906be req-0064d7f3-4d1a-477a-b237-b319f459a1d9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Refreshing network info cache for port c1d00195-4d32-45ac-b745-1a913060f39d _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  7 16:17:03 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.693 2 DEBUG oslo_concurrency.lockutils [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:17:03 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.693 2 DEBUG oslo_concurrency.lockutils [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:17:03 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.694 2 DEBUG oslo_concurrency.lockutils [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:17:03 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.694 2 DEBUG oslo_concurrency.lockutils [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:17:03 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.695 2 DEBUG oslo_concurrency.lockutils [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:17:03 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.697 2 INFO nova.compute.manager [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Terminating instance
Oct  7 16:17:03 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.698 2 DEBUG nova.compute.manager [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  7 16:17:03 np0005474864 kernel: tapc1d00195-4d (unregistering): left promiscuous mode
Oct  7 16:17:03 np0005474864 NetworkManager[51631]: <info>  [1759868223.7221] device (tapc1d00195-4d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:17:03 np0005474864 ovn_controller[94801]: 2025-10-07T20:17:03Z|00171|binding|INFO|Releasing lport c1d00195-4d32-45ac-b745-1a913060f39d from this chassis (sb_readonly=0)
Oct  7 16:17:03 np0005474864 ovn_controller[94801]: 2025-10-07T20:17:03Z|00172|binding|INFO|Setting lport c1d00195-4d32-45ac-b745-1a913060f39d down in Southbound
Oct  7 16:17:03 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.737 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:17:03 np0005474864 ovn_controller[94801]: 2025-10-07T20:17:03Z|00173|binding|INFO|Removing iface tapc1d00195-4d ovn-installed in OVS
Oct  7 16:17:03 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:17:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:03.749 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e1:c8:d3 10.100.0.6'], port_security=['fa:16:3e:e1:c8:d3 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d6a1d6d4-586d-450e-8b73-6ad134098649', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a6b53ec8-0088-49b2-96e7-c4770f1b7fbc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34f1dcb0-f04e-41a8-8b02-05684b457dc5, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=c1d00195-4d32-45ac-b745-1a913060f39d) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Oct  7 16:17:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:03.752 103685 INFO neutron.agent.ovn.metadata.agent [-] Port c1d00195-4d32-45ac-b745-1a913060f39d in datapath d6a1d6d4-586d-450e-8b73-6ad134098649 unbound from our chassis
Oct  7 16:17:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:03.756 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d6a1d6d4-586d-450e-8b73-6ad134098649, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct  7 16:17:03 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.758 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:17:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:03.758 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[2bc94f55-3497-45c6-9ce9-c7c2d9b0d2d7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  7 16:17:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:03.758 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649 namespace which is not needed anymore
Oct  7 16:17:03 np0005474864 kernel: tapfb8c9e14-7c (unregistering): left promiscuous mode
Oct  7 16:17:03 np0005474864 NetworkManager[51631]: <info>  [1759868223.7691] device (tapfb8c9e14-7c): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:17:03 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.782 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:03 np0005474864 ovn_controller[94801]: 2025-10-07T20:17:03Z|00174|binding|INFO|Releasing lport fb8c9e14-7c02-42cf-9fe9-afc4a7316794 from this chassis (sb_readonly=0)
Oct  7 16:17:03 np0005474864 ovn_controller[94801]: 2025-10-07T20:17:03Z|00175|binding|INFO|Setting lport fb8c9e14-7c02-42cf-9fe9-afc4a7316794 down in Southbound
Oct  7 16:17:03 np0005474864 ovn_controller[94801]: 2025-10-07T20:17:03Z|00176|binding|INFO|Removing iface tapfb8c9e14-7c ovn-installed in OVS
Oct  7 16:17:03 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.787 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:03 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:03.791 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:24:aa:7e 2001:db8:0:1:f816:3eff:fe24:aa7e 2001:db8::f816:3eff:fe24:aa7e'], port_security=['fa:16:3e:24:aa:7e 2001:db8:0:1:f816:3eff:fe24:aa7e 2001:db8::f816:3eff:fe24:aa7e'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe24:aa7e/64 2001:db8::f816:3eff:fe24:aa7e/64', 'neutron:device_id': 'c491b943-fbbd-46e0-be8c-74a8c1378ab3', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-99465e0c-6ee8-477a-94aa-ab737f76f9e4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'a6b53ec8-0088-49b2-96e7-c4770f1b7fbc', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=7fb1d4ce-b691-4091-872a-86df16b02e47, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=fb8c9e14-7c02-42cf-9fe9-afc4a7316794) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:17:03 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.813 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:03 np0005474864 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000019.scope: Deactivated successfully.
Oct  7 16:17:03 np0005474864 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d00000019.scope: Consumed 16.930s CPU time.
Oct  7 16:17:03 np0005474864 systemd-machined[152586]: Machine qemu-8-instance-00000019 terminated.
Oct  7 16:17:03 np0005474864 NetworkManager[51631]: <info>  [1759868223.9464] manager: (tapfb8c9e14-7c): new Tun device (/org/freedesktop/NetworkManager/Devices/88)
Oct  7 16:17:03 np0005474864 neutron-haproxy-ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649[224297]: [NOTICE]   (224301) : haproxy version is 2.8.14-c23fe91
Oct  7 16:17:03 np0005474864 neutron-haproxy-ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649[224297]: [NOTICE]   (224301) : path to executable is /usr/sbin/haproxy
Oct  7 16:17:03 np0005474864 neutron-haproxy-ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649[224297]: [WARNING]  (224301) : Exiting Master process...
Oct  7 16:17:03 np0005474864 neutron-haproxy-ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649[224297]: [ALERT]    (224301) : Current worker (224303) exited with code 143 (Terminated)
Oct  7 16:17:03 np0005474864 neutron-haproxy-ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649[224297]: [WARNING]  (224301) : All workers exited. Exiting... (0)
Oct  7 16:17:03 np0005474864 systemd[1]: libpod-402e48da0677afcc111abb888c3cdc1bab9e26fd5b9f5aa088e6308b01cc5696.scope: Deactivated successfully.
Oct  7 16:17:03 np0005474864 podman[225398]: 2025-10-07 20:17:03.985853228 +0000 UTC m=+0.084697294 container died 402e48da0677afcc111abb888c3cdc1bab9e26fd5b9f5aa088e6308b01cc5696 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.997 2 INFO nova.virt.libvirt.driver [-] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Instance destroyed successfully.#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:03.998 2 DEBUG nova.objects.instance [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'resources' on Instance uuid c491b943-fbbd-46e0-be8c-74a8c1378ab3 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.019 2 DEBUG nova.virt.libvirt.vif [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:15:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1139550974',display_name='tempest-TestGettingAddress-server-1139550974',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1139550974',id=25,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMHh+5BPf+7kY8pKVSG/i4hyBxtn8tVeo7dRQWQLUq5e94pXRwlhKtKtONimxLXuRPnL+V+cWD3ZzAcmW1n0ScN01FDbsZi+mcaAgIQIdbWXoCLml/sEmzbizmftxgLGkQ==',key_name='tempest-TestGettingAddress-195199531',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:15:29Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-nrbptih5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:15:29Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=c491b943-fbbd-46e0-be8c-74a8c1378ab3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c1d00195-4d32-45ac-b745-1a913060f39d", "address": "fa:16:3e:e1:c8:d3", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d00195-4d", "ovs_interfaceid": "c1d00195-4d32-45ac-b745-1a913060f39d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.019 2 DEBUG nova.network.os_vif_util [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "c1d00195-4d32-45ac-b745-1a913060f39d", "address": "fa:16:3e:e1:c8:d3", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.248", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d00195-4d", "ovs_interfaceid": "c1d00195-4d32-45ac-b745-1a913060f39d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.020 2 DEBUG nova.network.os_vif_util [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e1:c8:d3,bridge_name='br-int',has_traffic_filtering=True,id=c1d00195-4d32-45ac-b745-1a913060f39d,network=Network(d6a1d6d4-586d-450e-8b73-6ad134098649),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1d00195-4d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.021 2 DEBUG os_vif [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e1:c8:d3,bridge_name='br-int',has_traffic_filtering=True,id=c1d00195-4d32-45ac-b745-1a913060f39d,network=Network(d6a1d6d4-586d-450e-8b73-6ad134098649),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1d00195-4d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:17:04 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-402e48da0677afcc111abb888c3cdc1bab9e26fd5b9f5aa088e6308b01cc5696-userdata-shm.mount: Deactivated successfully.
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.023 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.023 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc1d00195-4d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:17:04 np0005474864 systemd[1]: var-lib-containers-storage-overlay-2094ad2571a4bdd633e9bccdea8d7d72b71afbb1cd7df15f928f8fdbd88565c6-merged.mount: Deactivated successfully.
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.027 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:17:04 np0005474864 podman[225398]: 2025-10-07 20:17:04.031366161 +0000 UTC m=+0.130210227 container cleanup 402e48da0677afcc111abb888c3cdc1bab9e26fd5b9f5aa088e6308b01cc5696 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.035 2 INFO os_vif [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e1:c8:d3,bridge_name='br-int',has_traffic_filtering=True,id=c1d00195-4d32-45ac-b745-1a913060f39d,network=Network(d6a1d6d4-586d-450e-8b73-6ad134098649),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc1d00195-4d')#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.036 2 DEBUG nova.virt.libvirt.vif [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:15:14Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1139550974',display_name='tempest-TestGettingAddress-server-1139550974',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1139550974',id=25,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMHh+5BPf+7kY8pKVSG/i4hyBxtn8tVeo7dRQWQLUq5e94pXRwlhKtKtONimxLXuRPnL+V+cWD3ZzAcmW1n0ScN01FDbsZi+mcaAgIQIdbWXoCLml/sEmzbizmftxgLGkQ==',key_name='tempest-TestGettingAddress-195199531',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:15:29Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-nrbptih5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:15:29Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=c491b943-fbbd-46e0-be8c-74a8c1378ab3,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "address": "fa:16:3e:24:aa:7e", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe24:aa7e", 
"type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8c9e14-7c", "ovs_interfaceid": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.037 2 DEBUG nova.network.os_vif_util [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "address": "fa:16:3e:24:aa:7e", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8c9e14-7c", "ovs_interfaceid": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.039 2 DEBUG nova.network.os_vif_util [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:24:aa:7e,bridge_name='br-int',has_traffic_filtering=True,id=fb8c9e14-7c02-42cf-9fe9-afc4a7316794,network=Network(99465e0c-6ee8-477a-94aa-ab737f76f9e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb8c9e14-7c') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.039 2 DEBUG os_vif [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:aa:7e,bridge_name='br-int',has_traffic_filtering=True,id=fb8c9e14-7c02-42cf-9fe9-afc4a7316794,network=Network(99465e0c-6ee8-477a-94aa-ab737f76f9e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb8c9e14-7c') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.041 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.041 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfb8c9e14-7c, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.046 2 INFO os_vif [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:24:aa:7e,bridge_name='br-int',has_traffic_filtering=True,id=fb8c9e14-7c02-42cf-9fe9-afc4a7316794,network=Network(99465e0c-6ee8-477a-94aa-ab737f76f9e4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb8c9e14-7c')#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.047 2 INFO nova.virt.libvirt.driver [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Deleting instance files /var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3_del#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.048 2 INFO nova.virt.libvirt.driver [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Deletion of /var/lib/nova/instances/c491b943-fbbd-46e0-be8c-74a8c1378ab3_del complete#033[00m
Oct  7 16:17:04 np0005474864 systemd[1]: libpod-conmon-402e48da0677afcc111abb888c3cdc1bab9e26fd5b9f5aa088e6308b01cc5696.scope: Deactivated successfully.
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.106 2 INFO nova.compute.manager [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.107 2 DEBUG oslo.service.loopingcall [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.108 2 DEBUG nova.compute.manager [-] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.108 2 DEBUG nova.network.neutron [-] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  7 16:17:04 np0005474864 podman[225454]: 2025-10-07 20:17:04.120362457 +0000 UTC m=+0.054936883 container remove 402e48da0677afcc111abb888c3cdc1bab9e26fd5b9f5aa088e6308b01cc5696 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.125 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[21243995-dc56-4546-a82d-c40f5f63d2c0]: (4, ('Tue Oct  7 08:17:03 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649 (402e48da0677afcc111abb888c3cdc1bab9e26fd5b9f5aa088e6308b01cc5696)\n402e48da0677afcc111abb888c3cdc1bab9e26fd5b9f5aa088e6308b01cc5696\nTue Oct  7 08:17:04 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649 (402e48da0677afcc111abb888c3cdc1bab9e26fd5b9f5aa088e6308b01cc5696)\n402e48da0677afcc111abb888c3cdc1bab9e26fd5b9f5aa088e6308b01cc5696\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.127 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[a6ced085-4fb7-4b70-9718-78f0c7f75454]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.127 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd6a1d6d4-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:17:04 np0005474864 kernel: tapd6a1d6d4-50: left promiscuous mode
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.156 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[6723b6c6-1f59-4e79-87ba-c79e3fc24458]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.187 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[4f9964c8-042d-480b-b8e0-aa6830e340fb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.189 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[97b113a7-1a42-44c9-a766-7fb54d3dd22a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.210 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[08162b2b-acab-4b7d-bdc9-d5527bb0f9f8]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376463, 'reachable_time': 16560, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225468, 'error': None, 'target': 'ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:04 np0005474864 systemd[1]: run-netns-ovnmeta\x2dd6a1d6d4\x2d586d\x2d450e\x2d8b73\x2d6ad134098649.mount: Deactivated successfully.
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.214 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d6a1d6d4-586d-450e-8b73-6ad134098649 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.215 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[c79b9db3-eafd-429e-8ce2-bf7ac67403be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.216 103685 INFO neutron.agent.ovn.metadata.agent [-] Port fb8c9e14-7c02-42cf-9fe9-afc4a7316794 in datapath 99465e0c-6ee8-477a-94aa-ab737f76f9e4 unbound from our chassis#033[00m
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.219 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 99465e0c-6ee8-477a-94aa-ab737f76f9e4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.220 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[4a89091a-136f-499e-8d0a-624c4c927ab9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.221 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4 namespace which is not needed anymore#033[00m
Oct  7 16:17:04 np0005474864 neutron-haproxy-ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4[224370]: [NOTICE]   (224393) : haproxy version is 2.8.14-c23fe91
Oct  7 16:17:04 np0005474864 neutron-haproxy-ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4[224370]: [NOTICE]   (224393) : path to executable is /usr/sbin/haproxy
Oct  7 16:17:04 np0005474864 neutron-haproxy-ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4[224370]: [WARNING]  (224393) : Exiting Master process...
Oct  7 16:17:04 np0005474864 neutron-haproxy-ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4[224370]: [WARNING]  (224393) : Exiting Master process...
Oct  7 16:17:04 np0005474864 neutron-haproxy-ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4[224370]: [ALERT]    (224393) : Current worker (224398) exited with code 143 (Terminated)
Oct  7 16:17:04 np0005474864 neutron-haproxy-ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4[224370]: [WARNING]  (224393) : All workers exited. Exiting... (0)
Oct  7 16:17:04 np0005474864 systemd[1]: libpod-6231a4e01d581b968a285e283969363895d12513731767a4ba8d878b2fe322ce.scope: Deactivated successfully.
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.361 2 DEBUG nova.compute.manager [req-9bcf72be-409f-429b-8bf3-035fb79e1cb8 req-9d6a6981-b3fc-4559-b24e-bf02fa68cf92 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received event network-vif-unplugged-c1d00195-4d32-45ac-b745-1a913060f39d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.362 2 DEBUG oslo_concurrency.lockutils [req-9bcf72be-409f-429b-8bf3-035fb79e1cb8 req-9d6a6981-b3fc-4559-b24e-bf02fa68cf92 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.362 2 DEBUG oslo_concurrency.lockutils [req-9bcf72be-409f-429b-8bf3-035fb79e1cb8 req-9d6a6981-b3fc-4559-b24e-bf02fa68cf92 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.363 2 DEBUG oslo_concurrency.lockutils [req-9bcf72be-409f-429b-8bf3-035fb79e1cb8 req-9d6a6981-b3fc-4559-b24e-bf02fa68cf92 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.364 2 DEBUG nova.compute.manager [req-9bcf72be-409f-429b-8bf3-035fb79e1cb8 req-9d6a6981-b3fc-4559-b24e-bf02fa68cf92 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] No waiting events found dispatching network-vif-unplugged-c1d00195-4d32-45ac-b745-1a913060f39d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.364 2 DEBUG nova.compute.manager [req-9bcf72be-409f-429b-8bf3-035fb79e1cb8 req-9d6a6981-b3fc-4559-b24e-bf02fa68cf92 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received event network-vif-unplugged-c1d00195-4d32-45ac-b745-1a913060f39d for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  7 16:17:04 np0005474864 podman[225486]: 2025-10-07 20:17:04.365882761 +0000 UTC m=+0.052486483 container died 6231a4e01d581b968a285e283969363895d12513731767a4ba8d878b2fe322ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  7 16:17:04 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6231a4e01d581b968a285e283969363895d12513731767a4ba8d878b2fe322ce-userdata-shm.mount: Deactivated successfully.
Oct  7 16:17:04 np0005474864 systemd[1]: var-lib-containers-storage-overlay-5e49f475b67c5c81ba39c9e68fdd287f05ea08d70167bb01844cd69ee5668480-merged.mount: Deactivated successfully.
Oct  7 16:17:04 np0005474864 podman[225486]: 2025-10-07 20:17:04.405728341 +0000 UTC m=+0.092332053 container cleanup 6231a4e01d581b968a285e283969363895d12513731767a4ba8d878b2fe322ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  7 16:17:04 np0005474864 systemd[1]: libpod-conmon-6231a4e01d581b968a285e283969363895d12513731767a4ba8d878b2fe322ce.scope: Deactivated successfully.
Oct  7 16:17:04 np0005474864 podman[225515]: 2025-10-07 20:17:04.483670582 +0000 UTC m=+0.053003378 container remove 6231a4e01d581b968a285e283969363895d12513731767a4ba8d878b2fe322ce (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.492 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[36797a8f-aabd-4865-b380-117ea100c5b7]: (4, ('Tue Oct  7 08:17:04 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4 (6231a4e01d581b968a285e283969363895d12513731767a4ba8d878b2fe322ce)\n6231a4e01d581b968a285e283969363895d12513731767a4ba8d878b2fe322ce\nTue Oct  7 08:17:04 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4 (6231a4e01d581b968a285e283969363895d12513731767a4ba8d878b2fe322ce)\n6231a4e01d581b968a285e283969363895d12513731767a4ba8d878b2fe322ce\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.495 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[18b71301-3c8c-4ac2-854e-3e0abab1600e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.496 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap99465e0c-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:17:04 np0005474864 kernel: tap99465e0c-60: left promiscuous mode
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.498 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.512 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[9174e08e-61e4-4571-9257-7c1864ae2aef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:04 np0005474864 nova_compute[192593]: 2025-10-07 20:17:04.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.555 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[db2cafe5-df82-4633-8e8e-8a0190fedc25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.557 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[077fa8b9-2046-42af-ba9e-33b6aae897b3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.586 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[be254d12-e52c-4067-a182-98d7a69f10ed]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 376562, 'reachable_time': 21414, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225530, 'error': None, 'target': 'ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.589 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-99465e0c-6ee8-477a-94aa-ab737f76f9e4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:17:04 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:04.589 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[0dd25613-ecc9-4f1d-921e-66ebbca8edd2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:05 np0005474864 systemd[1]: run-netns-ovnmeta\x2d99465e0c\x2d6ee8\x2d477a\x2d94aa\x2dab737f76f9e4.mount: Deactivated successfully.
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.179 2 DEBUG nova.compute.manager [req-992d5256-253b-4676-8efd-34afb2415235 req-49c48c72-2cfb-4b15-b17d-b1aadcc04570 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received event network-vif-unplugged-fb8c9e14-7c02-42cf-9fe9-afc4a7316794 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.180 2 DEBUG oslo_concurrency.lockutils [req-992d5256-253b-4676-8efd-34afb2415235 req-49c48c72-2cfb-4b15-b17d-b1aadcc04570 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.180 2 DEBUG oslo_concurrency.lockutils [req-992d5256-253b-4676-8efd-34afb2415235 req-49c48c72-2cfb-4b15-b17d-b1aadcc04570 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.181 2 DEBUG oslo_concurrency.lockutils [req-992d5256-253b-4676-8efd-34afb2415235 req-49c48c72-2cfb-4b15-b17d-b1aadcc04570 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.181 2 DEBUG nova.compute.manager [req-992d5256-253b-4676-8efd-34afb2415235 req-49c48c72-2cfb-4b15-b17d-b1aadcc04570 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] No waiting events found dispatching network-vif-unplugged-fb8c9e14-7c02-42cf-9fe9-afc4a7316794 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.182 2 DEBUG nova.compute.manager [req-992d5256-253b-4676-8efd-34afb2415235 req-49c48c72-2cfb-4b15-b17d-b1aadcc04570 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received event network-vif-unplugged-fb8c9e14-7c02-42cf-9fe9-afc4a7316794 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.183 2 DEBUG nova.compute.manager [req-992d5256-253b-4676-8efd-34afb2415235 req-49c48c72-2cfb-4b15-b17d-b1aadcc04570 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received event network-vif-plugged-fb8c9e14-7c02-42cf-9fe9-afc4a7316794 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.183 2 DEBUG oslo_concurrency.lockutils [req-992d5256-253b-4676-8efd-34afb2415235 req-49c48c72-2cfb-4b15-b17d-b1aadcc04570 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.184 2 DEBUG oslo_concurrency.lockutils [req-992d5256-253b-4676-8efd-34afb2415235 req-49c48c72-2cfb-4b15-b17d-b1aadcc04570 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.184 2 DEBUG oslo_concurrency.lockutils [req-992d5256-253b-4676-8efd-34afb2415235 req-49c48c72-2cfb-4b15-b17d-b1aadcc04570 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.185 2 DEBUG nova.compute.manager [req-992d5256-253b-4676-8efd-34afb2415235 req-49c48c72-2cfb-4b15-b17d-b1aadcc04570 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] No waiting events found dispatching network-vif-plugged-fb8c9e14-7c02-42cf-9fe9-afc4a7316794 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.185 2 WARNING nova.compute.manager [req-992d5256-253b-4676-8efd-34afb2415235 req-49c48c72-2cfb-4b15-b17d-b1aadcc04570 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received unexpected event network-vif-plugged-fb8c9e14-7c02-42cf-9fe9-afc4a7316794 for instance with vm_state active and task_state deleting.#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.209 2 DEBUG nova.network.neutron [req-d6e25ce3-5866-4649-ad5f-a5aecd2906be req-0064d7f3-4d1a-477a-b237-b319f459a1d9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Updated VIF entry in instance network info cache for port c1d00195-4d32-45ac-b745-1a913060f39d. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.209 2 DEBUG nova.network.neutron [req-d6e25ce3-5866-4649-ad5f-a5aecd2906be req-0064d7f3-4d1a-477a-b237-b319f459a1d9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Updating instance_info_cache with network_info: [{"id": "c1d00195-4d32-45ac-b745-1a913060f39d", "address": "fa:16:3e:e1:c8:d3", "network": {"id": "d6a1d6d4-586d-450e-8b73-6ad134098649", "bridge": "br-int", "label": "tempest-network-smoke--1226703527", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc1d00195-4d", "ovs_interfaceid": "c1d00195-4d32-45ac-b745-1a913060f39d", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "address": "fa:16:3e:24:aa:7e", "network": {"id": "99465e0c-6ee8-477a-94aa-ab737f76f9e4", "bridge": "br-int", "label": "tempest-network-smoke--694715509", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe24:aa7e", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8c9e14-7c", "ovs_interfaceid": "fb8c9e14-7c02-42cf-9fe9-afc4a7316794", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.231 2 DEBUG oslo_concurrency.lockutils [req-d6e25ce3-5866-4649-ad5f-a5aecd2906be req-0064d7f3-4d1a-477a-b237-b319f459a1d9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-c491b943-fbbd-46e0-be8c-74a8c1378ab3" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.457 2 DEBUG nova.compute.manager [req-099386c4-8528-4615-8558-239f2b51a886 req-8089de04-34f0-4b34-bac9-50371ffdf8f2 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received event network-vif-plugged-c1d00195-4d32-45ac-b745-1a913060f39d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.458 2 DEBUG oslo_concurrency.lockutils [req-099386c4-8528-4615-8558-239f2b51a886 req-8089de04-34f0-4b34-bac9-50371ffdf8f2 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.459 2 DEBUG oslo_concurrency.lockutils [req-099386c4-8528-4615-8558-239f2b51a886 req-8089de04-34f0-4b34-bac9-50371ffdf8f2 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.459 2 DEBUG oslo_concurrency.lockutils [req-099386c4-8528-4615-8558-239f2b51a886 req-8089de04-34f0-4b34-bac9-50371ffdf8f2 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.459 2 DEBUG nova.compute.manager [req-099386c4-8528-4615-8558-239f2b51a886 req-8089de04-34f0-4b34-bac9-50371ffdf8f2 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] No waiting events found dispatching network-vif-plugged-c1d00195-4d32-45ac-b745-1a913060f39d pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.460 2 WARNING nova.compute.manager [req-099386c4-8528-4615-8558-239f2b51a886 req-8089de04-34f0-4b34-bac9-50371ffdf8f2 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received unexpected event network-vif-plugged-c1d00195-4d32-45ac-b745-1a913060f39d for instance with vm_state active and task_state deleting.#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.763 2 DEBUG nova.network.neutron [-] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.791 2 INFO nova.compute.manager [-] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Took 2.68 seconds to deallocate network for instance.#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.836 2 DEBUG oslo_concurrency.lockutils [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.837 2 DEBUG oslo_concurrency.lockutils [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.895 2 DEBUG nova.compute.provider_tree [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.910 2 DEBUG nova.scheduler.client.report [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.932 2 DEBUG oslo_concurrency.lockutils [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.095s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:17:06 np0005474864 nova_compute[192593]: 2025-10-07 20:17:06.958 2 INFO nova.scheduler.client.report [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Deleted allocations for instance c491b943-fbbd-46e0-be8c-74a8c1378ab3#033[00m
Oct  7 16:17:07 np0005474864 nova_compute[192593]: 2025-10-07 20:17:07.042 2 DEBUG oslo_concurrency.lockutils [None req-39c2544c-17c3-47b5-8966-009c35134ec4 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "c491b943-fbbd-46e0-be8c-74a8c1378ab3" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:17:07 np0005474864 nova_compute[192593]: 2025-10-07 20:17:07.770 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759868212.768841, 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:17:07 np0005474864 nova_compute[192593]: 2025-10-07 20:17:07.771 2 INFO nova.compute.manager [-] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] VM Stopped (Lifecycle Event)#033[00m
Oct  7 16:17:07 np0005474864 nova_compute[192593]: 2025-10-07 20:17:07.796 2 DEBUG nova.compute.manager [None req-7c872dd5-ea2e-48ac-a2a0-81a99bc7913f - - - - - -] [instance: 01ae62c6-0dbd-452f-b4af-a4d7cde1ddcc] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:17:08 np0005474864 nova_compute[192593]: 2025-10-07 20:17:08.285 2 DEBUG nova.compute.manager [req-602df4f9-6f5f-492e-9d9a-5c2dcb6b4277 req-b16e10a9-37c6-4526-b120-684de78ec9f5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received event network-vif-deleted-fb8c9e14-7c02-42cf-9fe9-afc4a7316794 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:17:08 np0005474864 nova_compute[192593]: 2025-10-07 20:17:08.286 2 DEBUG nova.compute.manager [req-602df4f9-6f5f-492e-9d9a-5c2dcb6b4277 req-b16e10a9-37c6-4526-b120-684de78ec9f5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Received event network-vif-deleted-c1d00195-4d32-45ac-b745-1a913060f39d external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:17:08 np0005474864 nova_compute[192593]: 2025-10-07 20:17:08.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:09 np0005474864 nova_compute[192593]: 2025-10-07 20:17:09.044 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:09 np0005474864 podman[225531]: 2025-10-07 20:17:09.384113839 +0000 UTC m=+0.073608117 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:17:09 np0005474864 podman[225533]: 2025-10-07 20:17:09.415205609 +0000 UTC m=+0.094217137 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:17:09 np0005474864 nova_compute[192593]: 2025-10-07 20:17:09.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:09 np0005474864 podman[225532]: 2025-10-07 20:17:09.496503695 +0000 UTC m=+0.171699724 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:17:12 np0005474864 nova_compute[192593]: 2025-10-07 20:17:12.741 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759868217.7400708, d37e7dbd-01fd-4484-9bdb-f09d24420fa7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:17:12 np0005474864 nova_compute[192593]: 2025-10-07 20:17:12.742 2 INFO nova.compute.manager [-] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] VM Stopped (Lifecycle Event)#033[00m
Oct  7 16:17:12 np0005474864 nova_compute[192593]: 2025-10-07 20:17:12.790 2 DEBUG nova.compute.manager [None req-73bb1c0b-e0d0-4731-8e34-b2cd7a0a1138 - - - - - -] [instance: d37e7dbd-01fd-4484-9bdb-f09d24420fa7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:17:13 np0005474864 nova_compute[192593]: 2025-10-07 20:17:13.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:13 np0005474864 nova_compute[192593]: 2025-10-07 20:17:13.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:14 np0005474864 nova_compute[192593]: 2025-10-07 20:17:14.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:14 np0005474864 nova_compute[192593]: 2025-10-07 20:17:14.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:14 np0005474864 nova_compute[192593]: 2025-10-07 20:17:14.475 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:15 np0005474864 podman[225592]: 2025-10-07 20:17:15.385705497 +0000 UTC m=+0.074697963 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  7 16:17:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:16.191 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:17:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:16.192 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:17:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:16.192 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:17:18 np0005474864 podman[225611]: 2025-10-07 20:17:18.385907432 +0000 UTC m=+0.076556396 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:17:18 np0005474864 nova_compute[192593]: 2025-10-07 20:17:18.561 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:18 np0005474864 nova_compute[192593]: 2025-10-07 20:17:18.995 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759868223.993833, c491b943-fbbd-46e0-be8c-74a8c1378ab3 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:17:18 np0005474864 nova_compute[192593]: 2025-10-07 20:17:18.995 2 INFO nova.compute.manager [-] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] VM Stopped (Lifecycle Event)#033[00m
Oct  7 16:17:19 np0005474864 nova_compute[192593]: 2025-10-07 20:17:19.049 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:19 np0005474864 nova_compute[192593]: 2025-10-07 20:17:19.057 2 DEBUG nova.compute.manager [None req-958221bb-86c7-496b-855c-2a07869e4bef - - - - - -] [instance: c491b943-fbbd-46e0-be8c-74a8c1378ab3] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:17:20 np0005474864 nova_compute[192593]: 2025-10-07 20:17:20.816 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "fb25f45d-8789-4dda-9e61-b950ed2aa282" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:17:20 np0005474864 nova_compute[192593]: 2025-10-07 20:17:20.816 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "fb25f45d-8789-4dda-9e61-b950ed2aa282" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:17:20 np0005474864 nova_compute[192593]: 2025-10-07 20:17:20.836 2 DEBUG nova.compute.manager [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  7 16:17:20 np0005474864 nova_compute[192593]: 2025-10-07 20:17:20.944 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:17:20 np0005474864 nova_compute[192593]: 2025-10-07 20:17:20.945 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:17:20 np0005474864 nova_compute[192593]: 2025-10-07 20:17:20.955 2 DEBUG nova.virt.hardware [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  7 16:17:20 np0005474864 nova_compute[192593]: 2025-10-07 20:17:20.956 2 INFO nova.compute.claims [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.134 2 DEBUG nova.compute.provider_tree [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.154 2 DEBUG nova.scheduler.client.report [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.202 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.257s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.203 2 DEBUG nova.compute.manager [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.257 2 DEBUG nova.compute.manager [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.257 2 DEBUG nova.network.neutron [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.280 2 INFO nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.322 2 DEBUG nova.compute.manager [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  7 16:17:21 np0005474864 podman[225637]: 2025-10-07 20:17:21.429129377 +0000 UTC m=+0.115228411 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, managed_by=edpm_ansible)
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.463 2 DEBUG nova.compute.manager [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.466 2 DEBUG nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.466 2 INFO nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Creating image(s)#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.467 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "/var/lib/nova/instances/fb25f45d-8789-4dda-9e61-b950ed2aa282/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.468 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "/var/lib/nova/instances/fb25f45d-8789-4dda-9e61-b950ed2aa282/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.469 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "/var/lib/nova/instances/fb25f45d-8789-4dda-9e61-b950ed2aa282/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.498 2 DEBUG oslo_concurrency.processutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.529 2 DEBUG nova.policy [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'ab16a639b2af44c7bc4218a1b1b91068', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '4b69ac5dc2b44912af0aa0671c7e3696', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.587 2 DEBUG oslo_concurrency.processutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.090s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.589 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.590 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.612 2 DEBUG oslo_concurrency.processutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.697 2 DEBUG oslo_concurrency.processutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.699 2 DEBUG oslo_concurrency.processutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/fb25f45d-8789-4dda-9e61-b950ed2aa282/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.736 2 DEBUG oslo_concurrency.processutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/fb25f45d-8789-4dda-9e61-b950ed2aa282/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.737 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.147s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.738 2 DEBUG oslo_concurrency.processutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.794 2 DEBUG oslo_concurrency.processutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.796 2 DEBUG nova.virt.disk.api [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Checking if we can resize image /var/lib/nova/instances/fb25f45d-8789-4dda-9e61-b950ed2aa282/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.796 2 DEBUG oslo_concurrency.processutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb25f45d-8789-4dda-9e61-b950ed2aa282/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.891 2 DEBUG oslo_concurrency.processutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/fb25f45d-8789-4dda-9e61-b950ed2aa282/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.892 2 DEBUG nova.virt.disk.api [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Cannot resize image /var/lib/nova/instances/fb25f45d-8789-4dda-9e61-b950ed2aa282/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.893 2 DEBUG nova.objects.instance [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lazy-loading 'migration_context' on Instance uuid fb25f45d-8789-4dda-9e61-b950ed2aa282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.921 2 DEBUG nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.922 2 DEBUG nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Ensure instance console log exists: /var/lib/nova/instances/fb25f45d-8789-4dda-9e61-b950ed2aa282/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.922 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.923 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:17:21 np0005474864 nova_compute[192593]: 2025-10-07 20:17:21.923 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:17:22 np0005474864 nova_compute[192593]: 2025-10-07 20:17:22.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:17:22 np0005474864 nova_compute[192593]: 2025-10-07 20:17:22.116 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:17:22 np0005474864 nova_compute[192593]: 2025-10-07 20:17:22.117 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:17:22 np0005474864 nova_compute[192593]: 2025-10-07 20:17:22.117 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:17:22 np0005474864 nova_compute[192593]: 2025-10-07 20:17:22.117 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:17:22 np0005474864 nova_compute[192593]: 2025-10-07 20:17:22.261 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:17:22 np0005474864 nova_compute[192593]: 2025-10-07 20:17:22.262 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5752MB free_disk=73.46330261230469GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:17:22 np0005474864 nova_compute[192593]: 2025-10-07 20:17:22.262 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:17:22 np0005474864 nova_compute[192593]: 2025-10-07 20:17:22.262 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:17:22 np0005474864 nova_compute[192593]: 2025-10-07 20:17:22.327 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Instance fb25f45d-8789-4dda-9e61-b950ed2aa282 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  7 16:17:22 np0005474864 nova_compute[192593]: 2025-10-07 20:17:22.327 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:17:22 np0005474864 nova_compute[192593]: 2025-10-07 20:17:22.327 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:17:22 np0005474864 nova_compute[192593]: 2025-10-07 20:17:22.376 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:17:22 np0005474864 nova_compute[192593]: 2025-10-07 20:17:22.394 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:17:22 np0005474864 nova_compute[192593]: 2025-10-07 20:17:22.420 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:17:22 np0005474864 nova_compute[192593]: 2025-10-07 20:17:22.420 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.158s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:17:22 np0005474864 nova_compute[192593]: 2025-10-07 20:17:22.891 2 DEBUG nova.network.neutron [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Successfully created port: 33558a12-26ac-4c74-ae48-99d8a83ec581 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  7 16:17:23 np0005474864 nova_compute[192593]: 2025-10-07 20:17:23.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:23 np0005474864 nova_compute[192593]: 2025-10-07 20:17:23.878 2 DEBUG nova.network.neutron [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Successfully updated port: 33558a12-26ac-4c74-ae48-99d8a83ec581 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:17:23 np0005474864 nova_compute[192593]: 2025-10-07 20:17:23.893 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "refresh_cache-fb25f45d-8789-4dda-9e61-b950ed2aa282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:17:23 np0005474864 nova_compute[192593]: 2025-10-07 20:17:23.894 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquired lock "refresh_cache-fb25f45d-8789-4dda-9e61-b950ed2aa282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:17:23 np0005474864 nova_compute[192593]: 2025-10-07 20:17:23.894 2 DEBUG nova.network.neutron [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:17:23 np0005474864 nova_compute[192593]: 2025-10-07 20:17:23.993 2 DEBUG nova.compute.manager [req-00a104d6-5fd1-4291-94d8-6fdc9af35a46 req-2d2b129a-2b9f-45a1-b17d-54a5644e6597 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Received event network-changed-33558a12-26ac-4c74-ae48-99d8a83ec581 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:17:23 np0005474864 nova_compute[192593]: 2025-10-07 20:17:23.993 2 DEBUG nova.compute.manager [req-00a104d6-5fd1-4291-94d8-6fdc9af35a46 req-2d2b129a-2b9f-45a1-b17d-54a5644e6597 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Refreshing instance network info cache due to event network-changed-33558a12-26ac-4c74-ae48-99d8a83ec581. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:17:23 np0005474864 nova_compute[192593]: 2025-10-07 20:17:23.994 2 DEBUG oslo_concurrency.lockutils [req-00a104d6-5fd1-4291-94d8-6fdc9af35a46 req-2d2b129a-2b9f-45a1-b17d-54a5644e6597 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-fb25f45d-8789-4dda-9e61-b950ed2aa282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:17:24 np0005474864 nova_compute[192593]: 2025-10-07 20:17:24.051 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:24 np0005474864 nova_compute[192593]: 2025-10-07 20:17:24.087 2 DEBUG nova.network.neutron [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:17:24 np0005474864 nova_compute[192593]: 2025-10-07 20:17:24.422 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:17:24 np0005474864 nova_compute[192593]: 2025-10-07 20:17:24.423 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.152 2 DEBUG nova.network.neutron [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Updating instance_info_cache with network_info: [{"id": "33558a12-26ac-4c74-ae48-99d8a83ec581", "address": "fa:16:3e:eb:38:05", "network": {"id": "debddd0e-a73a-4300-8608-f32a09aaf5f5", "bridge": "br-int", "label": "tempest-network-smoke--285528793", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33558a12-26", "ovs_interfaceid": "33558a12-26ac-4c74-ae48-99d8a83ec581", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.176 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Releasing lock "refresh_cache-fb25f45d-8789-4dda-9e61-b950ed2aa282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.177 2 DEBUG nova.compute.manager [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Instance network_info: |[{"id": "33558a12-26ac-4c74-ae48-99d8a83ec581", "address": "fa:16:3e:eb:38:05", "network": {"id": "debddd0e-a73a-4300-8608-f32a09aaf5f5", "bridge": "br-int", "label": "tempest-network-smoke--285528793", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33558a12-26", "ovs_interfaceid": "33558a12-26ac-4c74-ae48-99d8a83ec581", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.177 2 DEBUG oslo_concurrency.lockutils [req-00a104d6-5fd1-4291-94d8-6fdc9af35a46 req-2d2b129a-2b9f-45a1-b17d-54a5644e6597 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-fb25f45d-8789-4dda-9e61-b950ed2aa282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.178 2 DEBUG nova.network.neutron [req-00a104d6-5fd1-4291-94d8-6fdc9af35a46 req-2d2b129a-2b9f-45a1-b17d-54a5644e6597 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Refreshing network info cache for port 33558a12-26ac-4c74-ae48-99d8a83ec581 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.182 2 DEBUG nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Start _get_guest_xml network_info=[{"id": "33558a12-26ac-4c74-ae48-99d8a83ec581", "address": "fa:16:3e:eb:38:05", "network": {"id": "debddd0e-a73a-4300-8608-f32a09aaf5f5", "bridge": "br-int", "label": "tempest-network-smoke--285528793", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33558a12-26", "ovs_interfaceid": "33558a12-26ac-4c74-ae48-99d8a83ec581", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.188 2 WARNING nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.197 2 DEBUG nova.virt.libvirt.host [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.198 2 DEBUG nova.virt.libvirt.host [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.207 2 DEBUG nova.virt.libvirt.host [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.208 2 DEBUG nova.virt.libvirt.host [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.209 2 DEBUG nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.209 2 DEBUG nova.virt.hardware [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T20:09:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3fec056a-1226-48ad-a02c-e4fe097a9363',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.210 2 DEBUG nova.virt.hardware [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.210 2 DEBUG nova.virt.hardware [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.210 2 DEBUG nova.virt.hardware [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.210 2 DEBUG nova.virt.hardware [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.210 2 DEBUG nova.virt.hardware [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.211 2 DEBUG nova.virt.hardware [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.211 2 DEBUG nova.virt.hardware [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.211 2 DEBUG nova.virt.hardware [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.211 2 DEBUG nova.virt.hardware [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.212 2 DEBUG nova.virt.hardware [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.216 2 DEBUG nova.virt.libvirt.vif [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:17:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1464343180-access_point-7354237',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1464343180-access_point-7354237',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1464343180-ac',id=33,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL3gm4UcondpF65aN14lK1zwo3U3svklb77Uy+Ar5nFz++nGgl5mwqnzq8MS6J4RXahfIN3NZO7PhCbOaLeLJQFJJuGz0cP20VpVyqPiltkCJyRHWx/0VEeD4dTAkL4r7A==',key_name='tempest-TestSecurityGroupsBasicOps-367207505',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4b69ac5dc2b44912af0aa0671c7e3696',ramdisk_id='',reservation_id='r-d3z8g0do',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1464343180',owner_user_name='tempest-TestSecurityGroupsBasicOps-1464343180-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:17:21Z,user_data=None,user_id='ab16a639b2af44c7bc4218a1b1b91068',uuid=fb25f45d-8789-4dda-9e61-b950ed2aa282,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "33558a12-26ac-4c74-ae48-99d8a83ec581", "address": "fa:16:3e:eb:38:05", "network": {"id": "debddd0e-a73a-4300-8608-f32a09aaf5f5", "bridge": "br-int", "label": "tempest-network-smoke--285528793", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33558a12-26", "ovs_interfaceid": "33558a12-26ac-4c74-ae48-99d8a83ec581", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.216 2 DEBUG nova.network.os_vif_util [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Converting VIF {"id": "33558a12-26ac-4c74-ae48-99d8a83ec581", "address": "fa:16:3e:eb:38:05", "network": {"id": "debddd0e-a73a-4300-8608-f32a09aaf5f5", "bridge": "br-int", "label": "tempest-network-smoke--285528793", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33558a12-26", "ovs_interfaceid": "33558a12-26ac-4c74-ae48-99d8a83ec581", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.217 2 DEBUG nova.network.os_vif_util [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:38:05,bridge_name='br-int',has_traffic_filtering=True,id=33558a12-26ac-4c74-ae48-99d8a83ec581,network=Network(debddd0e-a73a-4300-8608-f32a09aaf5f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33558a12-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.218 2 DEBUG nova.objects.instance [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lazy-loading 'pci_devices' on Instance uuid fb25f45d-8789-4dda-9e61-b950ed2aa282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.237 2 DEBUG nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] End _get_guest_xml xml=<domain type="kvm">
Oct  7 16:17:25 np0005474864 nova_compute[192593]:  <uuid>fb25f45d-8789-4dda-9e61-b950ed2aa282</uuid>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:  <name>instance-00000021</name>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:  <memory>131072</memory>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:  <vcpu>1</vcpu>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-1464343180-access_point-7354237</nova:name>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <nova:creationTime>2025-10-07 20:17:25</nova:creationTime>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <nova:flavor name="m1.nano">
Oct  7 16:17:25 np0005474864 nova_compute[192593]:        <nova:memory>128</nova:memory>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:        <nova:disk>1</nova:disk>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:        <nova:swap>0</nova:swap>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:        <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:        <nova:vcpus>1</nova:vcpus>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      </nova:flavor>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <nova:owner>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:        <nova:user uuid="ab16a639b2af44c7bc4218a1b1b91068">tempest-TestSecurityGroupsBasicOps-1464343180-project-member</nova:user>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:        <nova:project uuid="4b69ac5dc2b44912af0aa0671c7e3696">tempest-TestSecurityGroupsBasicOps-1464343180</nova:project>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      </nova:owner>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <nova:ports>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:        <nova:port uuid="33558a12-26ac-4c74-ae48-99d8a83ec581">
Oct  7 16:17:25 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      </nova:ports>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    </nova:instance>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:  <sysinfo type="smbios">
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <entry name="manufacturer">RDO</entry>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <entry name="product">OpenStack Compute</entry>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <entry name="serial">fb25f45d-8789-4dda-9e61-b950ed2aa282</entry>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <entry name="uuid">fb25f45d-8789-4dda-9e61-b950ed2aa282</entry>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <entry name="family">Virtual Machine</entry>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <boot dev="hd"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <smbios mode="sysinfo"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <vmcoreinfo/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:  <clock offset="utc">
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <timer name="pit" tickpolicy="delay"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <timer name="hpet" present="no"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:  <cpu mode="custom" match="exact">
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <model>Nehalem</model>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <topology sockets="1" cores="1" threads="1"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <disk type="file" device="disk">
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/fb25f45d-8789-4dda-9e61-b950ed2aa282/disk"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <target dev="vda" bus="virtio"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <disk type="file" device="cdrom">
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <driver name="qemu" type="raw" cache="none"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/fb25f45d-8789-4dda-9e61-b950ed2aa282/disk.config"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <target dev="sda" bus="sata"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:eb:38:05"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <target dev="tap33558a12-26"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <serial type="pty">
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <log file="/var/lib/nova/instances/fb25f45d-8789-4dda-9e61-b950ed2aa282/console.log" append="off"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <input type="tablet" bus="usb"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <rng model="virtio">
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <backend model="random">/dev/urandom</backend>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <controller type="usb" index="0"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    <memballoon model="virtio">
Oct  7 16:17:25 np0005474864 nova_compute[192593]:      <stats period="10"/>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:17:25 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:17:25 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:17:25 np0005474864 nova_compute[192593]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.239 2 DEBUG nova.compute.manager [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Preparing to wait for external event network-vif-plugged-33558a12-26ac-4c74-ae48-99d8a83ec581 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.240 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "fb25f45d-8789-4dda-9e61-b950ed2aa282-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.240 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "fb25f45d-8789-4dda-9e61-b950ed2aa282-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.240 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "fb25f45d-8789-4dda-9e61-b950ed2aa282-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.242 2 DEBUG nova.virt.libvirt.vif [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:17:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1464343180-access_point-7354237',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1464343180-access_point-7354237',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1464343180-ac',id=33,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL3gm4UcondpF65aN14lK1zwo3U3svklb77Uy+Ar5nFz++nGgl5mwqnzq8MS6J4RXahfIN3NZO7PhCbOaLeLJQFJJuGz0cP20VpVyqPiltkCJyRHWx/0VEeD4dTAkL4r7A==',key_name='tempest-TestSecurityGroupsBasicOps-367207505',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4b69ac5dc2b44912af0aa0671c7e3696',ramdisk_id='',reservation_id='r-d3z8g0do',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-1464343180',owner_user_name='tempest-TestSecurityGroupsBasicOps-1464343180-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:17:21Z,user_data=None,user_id='ab16a639b2af44c7bc4218a1b1b91068',uuid=fb25f45d-8789-4dda-9e61-b950ed2aa282,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "33558a12-26ac-4c74-ae48-99d8a83ec581", "address": "fa:16:3e:eb:38:05", "network": {"id": "debddd0e-a73a-4300-8608-f32a09aaf5f5", "bridge": "br-int", "label": "tempest-network-smoke--285528793", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33558a12-26", "ovs_interfaceid": "33558a12-26ac-4c74-ae48-99d8a83ec581", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.242 2 DEBUG nova.network.os_vif_util [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Converting VIF {"id": "33558a12-26ac-4c74-ae48-99d8a83ec581", "address": "fa:16:3e:eb:38:05", "network": {"id": "debddd0e-a73a-4300-8608-f32a09aaf5f5", "bridge": "br-int", "label": "tempest-network-smoke--285528793", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33558a12-26", "ovs_interfaceid": "33558a12-26ac-4c74-ae48-99d8a83ec581", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.243 2 DEBUG nova.network.os_vif_util [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:eb:38:05,bridge_name='br-int',has_traffic_filtering=True,id=33558a12-26ac-4c74-ae48-99d8a83ec581,network=Network(debddd0e-a73a-4300-8608-f32a09aaf5f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33558a12-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.244 2 DEBUG os_vif [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:38:05,bridge_name='br-int',has_traffic_filtering=True,id=33558a12-26ac-4c74-ae48-99d8a83ec581,network=Network(debddd0e-a73a-4300-8608-f32a09aaf5f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33558a12-26') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.245 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.245 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.246 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.252 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap33558a12-26, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.253 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap33558a12-26, col_values=(('external_ids', {'iface-id': '33558a12-26ac-4c74-ae48-99d8a83ec581', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:eb:38:05', 'vm-uuid': 'fb25f45d-8789-4dda-9e61-b950ed2aa282'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.255 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:25 np0005474864 NetworkManager[51631]: <info>  [1759868245.2571] manager: (tap33558a12-26): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/89)
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.264 2 INFO os_vif [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:eb:38:05,bridge_name='br-int',has_traffic_filtering=True,id=33558a12-26ac-4c74-ae48-99d8a83ec581,network=Network(debddd0e-a73a-4300-8608-f32a09aaf5f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33558a12-26')#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.321 2 DEBUG nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.322 2 DEBUG nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.322 2 DEBUG nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] No VIF found with MAC fa:16:3e:eb:38:05, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.323 2 INFO nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Using config drive#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.736 2 INFO nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Creating config drive at /var/lib/nova/instances/fb25f45d-8789-4dda-9e61-b950ed2aa282/disk.config#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.742 2 DEBUG oslo_concurrency.processutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/fb25f45d-8789-4dda-9e61-b950ed2aa282/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9xctsipy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.880 2 DEBUG oslo_concurrency.processutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/fb25f45d-8789-4dda-9e61-b950ed2aa282/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9xctsipy" returned: 0 in 0.138s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:17:25 np0005474864 kernel: tap33558a12-26: entered promiscuous mode
Oct  7 16:17:25 np0005474864 NetworkManager[51631]: <info>  [1759868245.9764] manager: (tap33558a12-26): new Tun device (/org/freedesktop/NetworkManager/Devices/90)
Oct  7 16:17:25 np0005474864 nova_compute[192593]: 2025-10-07 20:17:25.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:25 np0005474864 ovn_controller[94801]: 2025-10-07T20:17:25Z|00177|binding|INFO|Claiming lport 33558a12-26ac-4c74-ae48-99d8a83ec581 for this chassis.
Oct  7 16:17:25 np0005474864 ovn_controller[94801]: 2025-10-07T20:17:25Z|00178|binding|INFO|33558a12-26ac-4c74-ae48-99d8a83ec581: Claiming fa:16:3e:eb:38:05 10.100.0.9
Oct  7 16:17:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:25.990 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:38:05 10.100.0.9'], port_security=['fa:16:3e:eb:38:05 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'fb25f45d-8789-4dda-9e61-b950ed2aa282', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-debddd0e-a73a-4300-8608-f32a09aaf5f5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b69ac5dc2b44912af0aa0671c7e3696', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0c85c855-3e12-49e2-82ce-13ef6b154141 cdc3e318-3d5f-4be4-aa5a-39fbe79b4fa3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ee087c7-bf03-4d87-9556-c31792623fff, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=33558a12-26ac-4c74-ae48-99d8a83ec581) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:17:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:25.991 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 33558a12-26ac-4c74-ae48-99d8a83ec581 in datapath debddd0e-a73a-4300-8608-f32a09aaf5f5 bound to our chassis#033[00m
Oct  7 16:17:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:25.993 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network debddd0e-a73a-4300-8608-f32a09aaf5f5#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.013 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[68f9b04b-197f-4262-95d6-3402a4765966]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.015 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdebddd0e-a1 in ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.018 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdebddd0e-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.018 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[4f231310-a608-4b02-bdf2-6e359156b241]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.019 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[06e563de-f2c5-42f0-bfab-14cb30d6fd3c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:26 np0005474864 systemd-udevd[225693]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.034 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[505615e1-3f32-4e50-8bd7-cd964e553854]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:26 np0005474864 NetworkManager[51631]: <info>  [1759868246.0459] device (tap33558a12-26): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:17:26 np0005474864 NetworkManager[51631]: <info>  [1759868246.0477] device (tap33558a12-26): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:17:26 np0005474864 systemd-machined[152586]: New machine qemu-11-instance-00000021.
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.059 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.064 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:26 np0005474864 systemd[1]: Started Virtual Machine qemu-11-instance-00000021.
Oct  7 16:17:26 np0005474864 ovn_controller[94801]: 2025-10-07T20:17:26Z|00179|binding|INFO|Setting lport 33558a12-26ac-4c74-ae48-99d8a83ec581 ovn-installed in OVS
Oct  7 16:17:26 np0005474864 ovn_controller[94801]: 2025-10-07T20:17:26Z|00180|binding|INFO|Setting lport 33558a12-26ac-4c74-ae48-99d8a83ec581 up in Southbound
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.069 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[185a71f9-39bc-4139-926e-1b36ec8e3532]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.088 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.092 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.119 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[13a18864-1752-4bdb-ba15-4faace2777d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.128 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[f393c840-8a88-431b-9dd4-1444b9d6e2f2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:26 np0005474864 NetworkManager[51631]: <info>  [1759868246.1298] manager: (tapdebddd0e-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/91)
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.130 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.131 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.132 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.184 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[0be6330d-ba86-4a2d-acf0-72f822cc7f5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.189 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[13974081-f510-45ee-8613-9d991748a503]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:26 np0005474864 NetworkManager[51631]: <info>  [1759868246.2287] device (tapdebddd0e-a0): carrier: link connected
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.239 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[c6751028-b1c0-4ee6-af4d-757e3d82602f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.268 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[bbf65caa-d036-422d-ae2d-47f2cd7f6e7b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdebddd0e-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:09:e1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388212, 'reachable_time': 29769, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225727, 'error': None, 'target': 'ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.293 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c4283972-8116-4033-b47c-c0be13f40496]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe57:9e1'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 388212, 'tstamp': 388212}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225730, 'error': None, 'target': 'ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.320 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[2a316f40-5903-4115-8e3b-df67ff90fbf9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdebddd0e-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:57:09:e1'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 110, 'tx_bytes': 180, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 53], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388212, 'reachable_time': 29769, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 152, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 152, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225734, 'error': None, 'target': 'ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.370 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[7f7f6d59-3a35-49be-bad0-6286e07772f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.472 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c608363b-c924-4af6-a0a4-01adc632947a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.474 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdebddd0e-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.474 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.475 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdebddd0e-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.477 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:26 np0005474864 NetworkManager[51631]: <info>  [1759868246.4778] manager: (tapdebddd0e-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/92)
Oct  7 16:17:26 np0005474864 kernel: tapdebddd0e-a0: entered promiscuous mode
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.479 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdebddd0e-a0, col_values=(('external_ids', {'iface-id': '8ba4b7bf-17cf-40ee-86c3-b763b016c353'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:26 np0005474864 ovn_controller[94801]: 2025-10-07T20:17:26Z|00181|binding|INFO|Releasing lport 8ba4b7bf-17cf-40ee-86c3-b763b016c353 from this chassis (sb_readonly=0)
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.482 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.482 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/debddd0e-a73a-4300-8608-f32a09aaf5f5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/debddd0e-a73a-4300-8608-f32a09aaf5f5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.484 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[056e8238-f046-4602-91f3-34dff41c67aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.484 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-debddd0e-a73a-4300-8608-f32a09aaf5f5
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/debddd0e-a73a-4300-8608-f32a09aaf5f5.pid.haproxy
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID debddd0e-a73a-4300-8608-f32a09aaf5f5
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:17:26 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:26.486 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5', 'env', 'PROCESS_TAG=haproxy-debddd0e-a73a-4300-8608-f32a09aaf5f5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/debddd0e-a73a-4300-8608-f32a09aaf5f5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.493 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.868 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868246.868086, fb25f45d-8789-4dda-9e61-b950ed2aa282 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.869 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] VM Started (Lifecycle Event)#033[00m
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.889 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.893 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868246.869238, fb25f45d-8789-4dda-9e61-b950ed2aa282 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.893 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.914 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.917 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:17:26 np0005474864 nova_compute[192593]: 2025-10-07 20:17:26.938 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:17:26 np0005474864 podman[225768]: 2025-10-07 20:17:26.861186174 +0000 UTC m=+0.029879981 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:17:26 np0005474864 podman[225768]: 2025-10-07 20:17:26.963402839 +0000 UTC m=+0.132096636 container create e49e8218770336824d43bed8c38321241531bde0e3129a76e67e3ab42db97711 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:17:27 np0005474864 systemd[1]: Started libpod-conmon-e49e8218770336824d43bed8c38321241531bde0e3129a76e67e3ab42db97711.scope.
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.058 2 DEBUG nova.network.neutron [req-00a104d6-5fd1-4291-94d8-6fdc9af35a46 req-2d2b129a-2b9f-45a1-b17d-54a5644e6597 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Updated VIF entry in instance network info cache for port 33558a12-26ac-4c74-ae48-99d8a83ec581. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.060 2 DEBUG nova.network.neutron [req-00a104d6-5fd1-4291-94d8-6fdc9af35a46 req-2d2b129a-2b9f-45a1-b17d-54a5644e6597 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Updating instance_info_cache with network_info: [{"id": "33558a12-26ac-4c74-ae48-99d8a83ec581", "address": "fa:16:3e:eb:38:05", "network": {"id": "debddd0e-a73a-4300-8608-f32a09aaf5f5", "bridge": "br-int", "label": "tempest-network-smoke--285528793", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33558a12-26", "ovs_interfaceid": "33558a12-26ac-4c74-ae48-99d8a83ec581", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:17:27 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:17:27 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6da8f18a01750e094d2b48ecdea270840a3b1446974a0f23ee4798e28666f5ba/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.077 2 DEBUG oslo_concurrency.lockutils [req-00a104d6-5fd1-4291-94d8-6fdc9af35a46 req-2d2b129a-2b9f-45a1-b17d-54a5644e6597 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-fb25f45d-8789-4dda-9e61-b950ed2aa282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:17:27 np0005474864 podman[225768]: 2025-10-07 20:17:27.130144772 +0000 UTC m=+0.298838569 container init e49e8218770336824d43bed8c38321241531bde0e3129a76e67e3ab42db97711 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:17:27 np0005474864 podman[225768]: 2025-10-07 20:17:27.141592982 +0000 UTC m=+0.310286749 container start e49e8218770336824d43bed8c38321241531bde0e3129a76e67e3ab42db97711 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  7 16:17:27 np0005474864 neutron-haproxy-ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5[225783]: [NOTICE]   (225787) : New worker (225789) forked
Oct  7 16:17:27 np0005474864 neutron-haproxy-ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5[225783]: [NOTICE]   (225787) : Loading success.
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.864 2 DEBUG nova.compute.manager [req-152a2987-73d2-44d3-9dca-91d656cc33c2 req-c72b2e02-4cf5-43e3-a6ca-708ed617ceab 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Received event network-vif-plugged-33558a12-26ac-4c74-ae48-99d8a83ec581 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.865 2 DEBUG oslo_concurrency.lockutils [req-152a2987-73d2-44d3-9dca-91d656cc33c2 req-c72b2e02-4cf5-43e3-a6ca-708ed617ceab 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "fb25f45d-8789-4dda-9e61-b950ed2aa282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.865 2 DEBUG oslo_concurrency.lockutils [req-152a2987-73d2-44d3-9dca-91d656cc33c2 req-c72b2e02-4cf5-43e3-a6ca-708ed617ceab 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "fb25f45d-8789-4dda-9e61-b950ed2aa282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.866 2 DEBUG oslo_concurrency.lockutils [req-152a2987-73d2-44d3-9dca-91d656cc33c2 req-c72b2e02-4cf5-43e3-a6ca-708ed617ceab 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "fb25f45d-8789-4dda-9e61-b950ed2aa282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.866 2 DEBUG nova.compute.manager [req-152a2987-73d2-44d3-9dca-91d656cc33c2 req-c72b2e02-4cf5-43e3-a6ca-708ed617ceab 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Processing event network-vif-plugged-33558a12-26ac-4c74-ae48-99d8a83ec581 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.867 2 DEBUG nova.compute.manager [req-152a2987-73d2-44d3-9dca-91d656cc33c2 req-c72b2e02-4cf5-43e3-a6ca-708ed617ceab 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Received event network-vif-plugged-33558a12-26ac-4c74-ae48-99d8a83ec581 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.867 2 DEBUG oslo_concurrency.lockutils [req-152a2987-73d2-44d3-9dca-91d656cc33c2 req-c72b2e02-4cf5-43e3-a6ca-708ed617ceab 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "fb25f45d-8789-4dda-9e61-b950ed2aa282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.868 2 DEBUG oslo_concurrency.lockutils [req-152a2987-73d2-44d3-9dca-91d656cc33c2 req-c72b2e02-4cf5-43e3-a6ca-708ed617ceab 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "fb25f45d-8789-4dda-9e61-b950ed2aa282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.868 2 DEBUG oslo_concurrency.lockutils [req-152a2987-73d2-44d3-9dca-91d656cc33c2 req-c72b2e02-4cf5-43e3-a6ca-708ed617ceab 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "fb25f45d-8789-4dda-9e61-b950ed2aa282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.868 2 DEBUG nova.compute.manager [req-152a2987-73d2-44d3-9dca-91d656cc33c2 req-c72b2e02-4cf5-43e3-a6ca-708ed617ceab 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] No waiting events found dispatching network-vif-plugged-33558a12-26ac-4c74-ae48-99d8a83ec581 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.869 2 WARNING nova.compute.manager [req-152a2987-73d2-44d3-9dca-91d656cc33c2 req-c72b2e02-4cf5-43e3-a6ca-708ed617ceab 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Received unexpected event network-vif-plugged-33558a12-26ac-4c74-ae48-99d8a83ec581 for instance with vm_state building and task_state spawning.#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.870 2 DEBUG nova.compute.manager [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.874 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868247.8744318, fb25f45d-8789-4dda-9e61-b950ed2aa282 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.875 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.877 2 DEBUG nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.881 2 INFO nova.virt.libvirt.driver [-] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Instance spawned successfully.#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.881 2 DEBUG nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.897 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.904 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.907 2 DEBUG nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.908 2 DEBUG nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.908 2 DEBUG nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.908 2 DEBUG nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.909 2 DEBUG nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.909 2 DEBUG nova.virt.libvirt.driver [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.942 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.997 2 INFO nova.compute.manager [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Took 6.53 seconds to spawn the instance on the hypervisor.#033[00m
Oct  7 16:17:27 np0005474864 nova_compute[192593]: 2025-10-07 20:17:27.997 2 DEBUG nova.compute.manager [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:17:28 np0005474864 nova_compute[192593]: 2025-10-07 20:17:28.092 2 INFO nova.compute.manager [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Took 7.19 seconds to build instance.#033[00m
Oct  7 16:17:28 np0005474864 nova_compute[192593]: 2025-10-07 20:17:28.116 2 DEBUG oslo_concurrency.lockutils [None req-63b11b2b-168a-4fb5-a68c-fd39cae121bb ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "fb25f45d-8789-4dda-9e61-b950ed2aa282" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.300s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:17:28 np0005474864 nova_compute[192593]: 2025-10-07 20:17:28.126 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:17:28 np0005474864 nova_compute[192593]: 2025-10-07 20:17:28.657 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:29 np0005474864 nova_compute[192593]: 2025-10-07 20:17:29.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:17:29 np0005474864 nova_compute[192593]: 2025-10-07 20:17:29.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:17:30 np0005474864 nova_compute[192593]: 2025-10-07 20:17:30.095 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:17:30 np0005474864 nova_compute[192593]: 2025-10-07 20:17:30.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:31 np0005474864 nova_compute[192593]: 2025-10-07 20:17:31.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:31 np0005474864 NetworkManager[51631]: <info>  [1759868251.2290] manager: (patch-provnet-53a6f422-cce6-4082-98aa-3619989346bc-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Oct  7 16:17:31 np0005474864 NetworkManager[51631]: <info>  [1759868251.2307] manager: (patch-br-int-to-provnet-53a6f422-cce6-4082-98aa-3619989346bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Oct  7 16:17:31 np0005474864 nova_compute[192593]: 2025-10-07 20:17:31.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:31 np0005474864 ovn_controller[94801]: 2025-10-07T20:17:31Z|00182|binding|INFO|Releasing lport 8ba4b7bf-17cf-40ee-86c3-b763b016c353 from this chassis (sb_readonly=0)
Oct  7 16:17:31 np0005474864 nova_compute[192593]: 2025-10-07 20:17:31.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:32 np0005474864 nova_compute[192593]: 2025-10-07 20:17:32.376 2 DEBUG nova.compute.manager [req-0f177f54-75c6-4826-b01e-71bb95b1a6e5 req-3b87e2a8-6117-4add-8c60-ffdebb8e0eae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Received event network-changed-33558a12-26ac-4c74-ae48-99d8a83ec581 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:17:32 np0005474864 nova_compute[192593]: 2025-10-07 20:17:32.376 2 DEBUG nova.compute.manager [req-0f177f54-75c6-4826-b01e-71bb95b1a6e5 req-3b87e2a8-6117-4add-8c60-ffdebb8e0eae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Refreshing instance network info cache due to event network-changed-33558a12-26ac-4c74-ae48-99d8a83ec581. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:17:32 np0005474864 nova_compute[192593]: 2025-10-07 20:17:32.377 2 DEBUG oslo_concurrency.lockutils [req-0f177f54-75c6-4826-b01e-71bb95b1a6e5 req-3b87e2a8-6117-4add-8c60-ffdebb8e0eae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-fb25f45d-8789-4dda-9e61-b950ed2aa282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:17:32 np0005474864 nova_compute[192593]: 2025-10-07 20:17:32.377 2 DEBUG oslo_concurrency.lockutils [req-0f177f54-75c6-4826-b01e-71bb95b1a6e5 req-3b87e2a8-6117-4add-8c60-ffdebb8e0eae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-fb25f45d-8789-4dda-9e61-b950ed2aa282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:17:32 np0005474864 nova_compute[192593]: 2025-10-07 20:17:32.377 2 DEBUG nova.network.neutron [req-0f177f54-75c6-4826-b01e-71bb95b1a6e5 req-3b87e2a8-6117-4add-8c60-ffdebb8e0eae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Refreshing network info cache for port 33558a12-26ac-4c74-ae48-99d8a83ec581 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:17:33 np0005474864 nova_compute[192593]: 2025-10-07 20:17:33.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:34 np0005474864 podman[225801]: 2025-10-07 20:17:34.393297668 +0000 UTC m=+0.081560251 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.openshift.tags=minimal rhel9, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, release=1755695350)
Oct  7 16:17:34 np0005474864 podman[225800]: 2025-10-07 20:17:34.409131714 +0000 UTC m=+0.097308744 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  7 16:17:35 np0005474864 nova_compute[192593]: 2025-10-07 20:17:35.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:35 np0005474864 nova_compute[192593]: 2025-10-07 20:17:35.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:35 np0005474864 nova_compute[192593]: 2025-10-07 20:17:35.479 2 DEBUG nova.network.neutron [req-0f177f54-75c6-4826-b01e-71bb95b1a6e5 req-3b87e2a8-6117-4add-8c60-ffdebb8e0eae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Updated VIF entry in instance network info cache for port 33558a12-26ac-4c74-ae48-99d8a83ec581. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:17:35 np0005474864 nova_compute[192593]: 2025-10-07 20:17:35.480 2 DEBUG nova.network.neutron [req-0f177f54-75c6-4826-b01e-71bb95b1a6e5 req-3b87e2a8-6117-4add-8c60-ffdebb8e0eae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Updating instance_info_cache with network_info: [{"id": "33558a12-26ac-4c74-ae48-99d8a83ec581", "address": "fa:16:3e:eb:38:05", "network": {"id": "debddd0e-a73a-4300-8608-f32a09aaf5f5", "bridge": "br-int", "label": "tempest-network-smoke--285528793", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33558a12-26", "ovs_interfaceid": "33558a12-26ac-4c74-ae48-99d8a83ec581", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:17:35 np0005474864 nova_compute[192593]: 2025-10-07 20:17:35.526 2 DEBUG oslo_concurrency.lockutils [req-0f177f54-75c6-4826-b01e-71bb95b1a6e5 req-3b87e2a8-6117-4add-8c60-ffdebb8e0eae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-fb25f45d-8789-4dda-9e61-b950ed2aa282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:17:38 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:38.567 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:51:dd 2001:db8:0:1:f816:3eff:feb9:51dd 2001:db8::f816:3eff:feb9:51dd'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:feb9:51dd/64 2001:db8::f816:3eff:feb9:51dd/64', 'neutron:device_id': 'ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=77bd480d-eb42-45d3-bb40-70369c07639b, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=870e1f98-97fb-4700-aa9c-95d30287b180) old=Port_Binding(mac=['fa:16:3e:b9:51:dd 2001:db8::f816:3eff:feb9:51dd'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feb9:51dd/64', 'neutron:device_id': 'ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:17:38 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:38.569 103685 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 870e1f98-97fb-4700-aa9c-95d30287b180 in datapath eb8078fd-1d3b-4c15-bb20-fdd51195fe7b updated#033[00m
Oct  7 16:17:38 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:38.571 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eb8078fd-1d3b-4c15-bb20-fdd51195fe7b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:17:38 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:38.573 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[0d527829-80a8-4c76-aebf-8af596b46cea]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:17:38 np0005474864 nova_compute[192593]: 2025-10-07 20:17:38.662 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:39 np0005474864 ovn_controller[94801]: 2025-10-07T20:17:39Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:eb:38:05 10.100.0.9
Oct  7 16:17:39 np0005474864 ovn_controller[94801]: 2025-10-07T20:17:39Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:eb:38:05 10.100.0.9
Oct  7 16:17:40 np0005474864 nova_compute[192593]: 2025-10-07 20:17:40.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:40 np0005474864 podman[225867]: 2025-10-07 20:17:40.386140639 +0000 UTC m=+0.068670749 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3)
Oct  7 16:17:40 np0005474864 podman[225865]: 2025-10-07 20:17:40.407317419 +0000 UTC m=+0.095692827 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  7 16:17:40 np0005474864 podman[225866]: 2025-10-07 20:17:40.475350819 +0000 UTC m=+0.151841545 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  7 16:17:43 np0005474864 nova_compute[192593]: 2025-10-07 20:17:43.665 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:45 np0005474864 nova_compute[192593]: 2025-10-07 20:17:45.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:46 np0005474864 podman[225931]: 2025-10-07 20:17:46.399805191 +0000 UTC m=+0.084035792 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Oct  7 16:17:48 np0005474864 nova_compute[192593]: 2025-10-07 20:17:48.705 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:49 np0005474864 podman[225950]: 2025-10-07 20:17:49.391632285 +0000 UTC m=+0.076895966 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  7 16:17:50 np0005474864 nova_compute[192593]: 2025-10-07 20:17:50.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:52 np0005474864 podman[225978]: 2025-10-07 20:17:52.41985431 +0000 UTC m=+0.102715430 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  7 16:17:52 np0005474864 ovn_controller[94801]: 2025-10-07T20:17:52Z|00183|binding|INFO|Releasing lport 8ba4b7bf-17cf-40ee-86c3-b763b016c353 from this chassis (sb_readonly=0)
Oct  7 16:17:52 np0005474864 nova_compute[192593]: 2025-10-07 20:17:52.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:53 np0005474864 nova_compute[192593]: 2025-10-07 20:17:53.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:53 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:53.865 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:76:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'ba:bb:91:e8:2b:5d'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:17:53 np0005474864 nova_compute[192593]: 2025-10-07 20:17:53.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:53 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:17:53.868 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  7 16:17:55 np0005474864 nova_compute[192593]: 2025-10-07 20:17:55.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:17:58 np0005474864 nova_compute[192593]: 2025-10-07 20:17:58.742 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:00 np0005474864 nova_compute[192593]: 2025-10-07 20:18:00.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:00 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:00.870 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:18:01 np0005474864 nova_compute[192593]: 2025-10-07 20:18:01.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:03 np0005474864 nova_compute[192593]: 2025-10-07 20:18:03.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:05 np0005474864 nova_compute[192593]: 2025-10-07 20:18:05.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:05 np0005474864 podman[226002]: 2025-10-07 20:18:05.399411482 +0000 UTC m=+0.081847478 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., architecture=x86_64, vendor=Red Hat, Inc., release=1755695350, version=9.6, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.openshift.tags=minimal rhel9, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  7 16:18:05 np0005474864 podman[226001]: 2025-10-07 20:18:05.430185639 +0000 UTC m=+0.113382577 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  7 16:18:08 np0005474864 nova_compute[192593]: 2025-10-07 20:18:08.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:10 np0005474864 nova_compute[192593]: 2025-10-07 20:18:10.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:11 np0005474864 nova_compute[192593]: 2025-10-07 20:18:11.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:11 np0005474864 podman[226048]: 2025-10-07 20:18:11.389591127 +0000 UTC m=+0.068681150 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=multipathd, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:18:11 np0005474864 podman[226046]: 2025-10-07 20:18:11.396352411 +0000 UTC m=+0.083937359 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  7 16:18:11 np0005474864 podman[226047]: 2025-10-07 20:18:11.450158531 +0000 UTC m=+0.140121067 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  7 16:18:13 np0005474864 nova_compute[192593]: 2025-10-07 20:18:13.750 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:15 np0005474864 nova_compute[192593]: 2025-10-07 20:18:15.276 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:16.192 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:18:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:16.193 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:18:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:16.194 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:18:17 np0005474864 podman[226110]: 2025-10-07 20:18:17.380091772 +0000 UTC m=+0.072259623 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Oct  7 16:18:18 np0005474864 nova_compute[192593]: 2025-10-07 20:18:18.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:20 np0005474864 podman[226129]: 2025-10-07 20:18:20.375999363 +0000 UTC m=+0.058651041 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.402 2 DEBUG oslo_concurrency.lockutils [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "fb25f45d-8789-4dda-9e61-b950ed2aa282" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.403 2 DEBUG oslo_concurrency.lockutils [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "fb25f45d-8789-4dda-9e61-b950ed2aa282" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.403 2 DEBUG oslo_concurrency.lockutils [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "fb25f45d-8789-4dda-9e61-b950ed2aa282-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.403 2 DEBUG oslo_concurrency.lockutils [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "fb25f45d-8789-4dda-9e61-b950ed2aa282-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.404 2 DEBUG oslo_concurrency.lockutils [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "fb25f45d-8789-4dda-9e61-b950ed2aa282-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.405 2 INFO nova.compute.manager [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Terminating instance#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.406 2 DEBUG nova.compute.manager [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  7 16:18:20 np0005474864 kernel: tap33558a12-26 (unregistering): left promiscuous mode
Oct  7 16:18:20 np0005474864 NetworkManager[51631]: <info>  [1759868300.4313] device (tap33558a12-26): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.450 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:20 np0005474864 ovn_controller[94801]: 2025-10-07T20:18:20Z|00184|binding|INFO|Releasing lport 33558a12-26ac-4c74-ae48-99d8a83ec581 from this chassis (sb_readonly=0)
Oct  7 16:18:20 np0005474864 ovn_controller[94801]: 2025-10-07T20:18:20Z|00185|binding|INFO|Setting lport 33558a12-26ac-4c74-ae48-99d8a83ec581 down in Southbound
Oct  7 16:18:20 np0005474864 ovn_controller[94801]: 2025-10-07T20:18:20Z|00186|binding|INFO|Removing iface tap33558a12-26 ovn-installed in OVS
Oct  7 16:18:20 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:20.456 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:eb:38:05 10.100.0.9'], port_security=['fa:16:3e:eb:38:05 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': 'fb25f45d-8789-4dda-9e61-b950ed2aa282', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-debddd0e-a73a-4300-8608-f32a09aaf5f5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4b69ac5dc2b44912af0aa0671c7e3696', 'neutron:revision_number': '4', 'neutron:security_group_ids': '0c85c855-3e12-49e2-82ce-13ef6b154141 cdc3e318-3d5f-4be4-aa5a-39fbe79b4fa3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0ee087c7-bf03-4d87-9556-c31792623fff, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=33558a12-26ac-4c74-ae48-99d8a83ec581) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:18:20 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:20.458 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 33558a12-26ac-4c74-ae48-99d8a83ec581 in datapath debddd0e-a73a-4300-8608-f32a09aaf5f5 unbound from our chassis#033[00m
Oct  7 16:18:20 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:20.461 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network debddd0e-a73a-4300-8608-f32a09aaf5f5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:18:20 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:20.462 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[03ba73df-490d-4031-81e8-96ac6ef03661]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:20 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:20.466 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5 namespace which is not needed anymore#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.480 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:20 np0005474864 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000021.scope: Deactivated successfully.
Oct  7 16:18:20 np0005474864 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000021.scope: Consumed 14.883s CPU time.
Oct  7 16:18:20 np0005474864 systemd-machined[152586]: Machine qemu-11-instance-00000021 terminated.
Oct  7 16:18:20 np0005474864 NetworkManager[51631]: <info>  [1759868300.6367] manager: (tap33558a12-26): new Tun device (/org/freedesktop/NetworkManager/Devices/95)
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.637 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:20 np0005474864 neutron-haproxy-ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5[225783]: [NOTICE]   (225787) : haproxy version is 2.8.14-c23fe91
Oct  7 16:18:20 np0005474864 neutron-haproxy-ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5[225783]: [NOTICE]   (225787) : path to executable is /usr/sbin/haproxy
Oct  7 16:18:20 np0005474864 neutron-haproxy-ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5[225783]: [WARNING]  (225787) : Exiting Master process...
Oct  7 16:18:20 np0005474864 neutron-haproxy-ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5[225783]: [WARNING]  (225787) : Exiting Master process...
Oct  7 16:18:20 np0005474864 neutron-haproxy-ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5[225783]: [ALERT]    (225787) : Current worker (225789) exited with code 143 (Terminated)
Oct  7 16:18:20 np0005474864 neutron-haproxy-ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5[225783]: [WARNING]  (225787) : All workers exited. Exiting... (0)
Oct  7 16:18:20 np0005474864 systemd[1]: libpod-e49e8218770336824d43bed8c38321241531bde0e3129a76e67e3ab42db97711.scope: Deactivated successfully.
Oct  7 16:18:20 np0005474864 podman[226178]: 2025-10-07 20:18:20.668972053 +0000 UTC m=+0.058953910 container died e49e8218770336824d43bed8c38321241531bde0e3129a76e67e3ab42db97711 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.698 2 INFO nova.virt.libvirt.driver [-] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Instance destroyed successfully.#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.699 2 DEBUG nova.objects.instance [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lazy-loading 'resources' on Instance uuid fb25f45d-8789-4dda-9e61-b950ed2aa282 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:18:20 np0005474864 systemd[1]: var-lib-containers-storage-overlay-6da8f18a01750e094d2b48ecdea270840a3b1446974a0f23ee4798e28666f5ba-merged.mount: Deactivated successfully.
Oct  7 16:18:20 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e49e8218770336824d43bed8c38321241531bde0e3129a76e67e3ab42db97711-userdata-shm.mount: Deactivated successfully.
Oct  7 16:18:20 np0005474864 podman[226178]: 2025-10-07 20:18:20.718198721 +0000 UTC m=+0.108180538 container cleanup e49e8218770336824d43bed8c38321241531bde0e3129a76e67e3ab42db97711 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.719 2 DEBUG nova.virt.libvirt.vif [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:17:19Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-1464343180-access_point-7354237',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-1464343180-access_point-7354237',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-1464343180-ac',id=33,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL3gm4UcondpF65aN14lK1zwo3U3svklb77Uy+Ar5nFz++nGgl5mwqnzq8MS6J4RXahfIN3NZO7PhCbOaLeLJQFJJuGz0cP20VpVyqPiltkCJyRHWx/0VEeD4dTAkL4r7A==',key_name='tempest-TestSecurityGroupsBasicOps-367207505',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:17:28Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4b69ac5dc2b44912af0aa0671c7e3696',ramdisk_id='',reservation_id='r-d3z8g0do',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-1464343180',owner_user_name='tempest-TestSecurityGroupsBasicOps-1464343180-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:17:28Z,user_data=None,user_id='ab16a639b2af44c7bc4218a1b1b91068',uuid=fb25f45d-8789-4dda-9e61-b950ed2aa282,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "33558a12-26ac-4c74-ae48-99d8a83ec581", "address": "fa:16:3e:eb:38:05", "network": {"id": "debddd0e-a73a-4300-8608-f32a09aaf5f5", "bridge": "br-int", "label": "tempest-network-smoke--285528793", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33558a12-26", "ovs_interfaceid": "33558a12-26ac-4c74-ae48-99d8a83ec581", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.720 2 DEBUG nova.network.os_vif_util [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Converting VIF {"id": "33558a12-26ac-4c74-ae48-99d8a83ec581", "address": "fa:16:3e:eb:38:05", "network": {"id": "debddd0e-a73a-4300-8608-f32a09aaf5f5", "bridge": "br-int", "label": "tempest-network-smoke--285528793", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.188", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33558a12-26", "ovs_interfaceid": "33558a12-26ac-4c74-ae48-99d8a83ec581", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.720 2 DEBUG nova.network.os_vif_util [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:eb:38:05,bridge_name='br-int',has_traffic_filtering=True,id=33558a12-26ac-4c74-ae48-99d8a83ec581,network=Network(debddd0e-a73a-4300-8608-f32a09aaf5f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33558a12-26') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.721 2 DEBUG os_vif [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:38:05,bridge_name='br-int',has_traffic_filtering=True,id=33558a12-26ac-4c74-ae48-99d8a83ec581,network=Network(debddd0e-a73a-4300-8608-f32a09aaf5f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33558a12-26') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.723 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap33558a12-26, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:20 np0005474864 systemd[1]: libpod-conmon-e49e8218770336824d43bed8c38321241531bde0e3129a76e67e3ab42db97711.scope: Deactivated successfully.
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.730 2 INFO os_vif [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:eb:38:05,bridge_name='br-int',has_traffic_filtering=True,id=33558a12-26ac-4c74-ae48-99d8a83ec581,network=Network(debddd0e-a73a-4300-8608-f32a09aaf5f5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap33558a12-26')#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.732 2 INFO nova.virt.libvirt.driver [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Deleting instance files /var/lib/nova/instances/fb25f45d-8789-4dda-9e61-b950ed2aa282_del#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.733 2 INFO nova.virt.libvirt.driver [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Deletion of /var/lib/nova/instances/fb25f45d-8789-4dda-9e61-b950ed2aa282_del complete#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.788 2 INFO nova.compute.manager [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.789 2 DEBUG oslo.service.loopingcall [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.789 2 DEBUG nova.compute.manager [-] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.789 2 DEBUG nova.network.neutron [-] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  7 16:18:20 np0005474864 podman[226218]: 2025-10-07 20:18:20.801199952 +0000 UTC m=+0.053429361 container remove e49e8218770336824d43bed8c38321241531bde0e3129a76e67e3ab42db97711 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:18:20 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:20.811 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[8585004a-48a0-4117-a1b5-c52f9e91d12c]: (4, ('Tue Oct  7 08:18:20 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5 (e49e8218770336824d43bed8c38321241531bde0e3129a76e67e3ab42db97711)\ne49e8218770336824d43bed8c38321241531bde0e3129a76e67e3ab42db97711\nTue Oct  7 08:18:20 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5 (e49e8218770336824d43bed8c38321241531bde0e3129a76e67e3ab42db97711)\ne49e8218770336824d43bed8c38321241531bde0e3129a76e67e3ab42db97711\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:20 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:20.813 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[1ee70c11-b039-4425-ade1-42a4e62e7f9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:20 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:20.815 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdebddd0e-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:18:20 np0005474864 kernel: tapdebddd0e-a0: left promiscuous mode
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:20 np0005474864 nova_compute[192593]: 2025-10-07 20:18:20.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:20 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:20.832 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[f40ed290-ded2-4f6f-8529-80ce723d2022]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:20 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:20.872 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[7a871a61-7ef8-484f-892e-ce422237c667]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:20 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:20.874 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[521a689e-71d3-4660-93a6-ad272374a69b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:20 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:20.899 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[7cbc282f-f6b8-463f-9280-61c4c60cfb7f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 388200, 'reachable_time': 39957, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226233, 'error': None, 'target': 'ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:20 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:20.902 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-debddd0e-a73a-4300-8608-f32a09aaf5f5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:18:20 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:20.903 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[645ecb0a-c8b8-4f51-8230-a655bd78c9c0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:20 np0005474864 systemd[1]: run-netns-ovnmeta\x2ddebddd0e\x2da73a\x2d4300\x2d8608\x2df32a09aaf5f5.mount: Deactivated successfully.
Oct  7 16:18:21 np0005474864 nova_compute[192593]: 2025-10-07 20:18:21.645 2 DEBUG nova.compute.manager [req-78f53fee-970a-4f9a-9053-5298ad0af67d req-567b88b1-2dc2-4b20-a9a8-306adc4e4433 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Received event network-changed-33558a12-26ac-4c74-ae48-99d8a83ec581 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:18:21 np0005474864 nova_compute[192593]: 2025-10-07 20:18:21.646 2 DEBUG nova.compute.manager [req-78f53fee-970a-4f9a-9053-5298ad0af67d req-567b88b1-2dc2-4b20-a9a8-306adc4e4433 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Refreshing instance network info cache due to event network-changed-33558a12-26ac-4c74-ae48-99d8a83ec581. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:18:21 np0005474864 nova_compute[192593]: 2025-10-07 20:18:21.647 2 DEBUG oslo_concurrency.lockutils [req-78f53fee-970a-4f9a-9053-5298ad0af67d req-567b88b1-2dc2-4b20-a9a8-306adc4e4433 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-fb25f45d-8789-4dda-9e61-b950ed2aa282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:18:21 np0005474864 nova_compute[192593]: 2025-10-07 20:18:21.647 2 DEBUG oslo_concurrency.lockutils [req-78f53fee-970a-4f9a-9053-5298ad0af67d req-567b88b1-2dc2-4b20-a9a8-306adc4e4433 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-fb25f45d-8789-4dda-9e61-b950ed2aa282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:18:21 np0005474864 nova_compute[192593]: 2025-10-07 20:18:21.648 2 DEBUG nova.network.neutron [req-78f53fee-970a-4f9a-9053-5298ad0af67d req-567b88b1-2dc2-4b20-a9a8-306adc4e4433 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Refreshing network info cache for port 33558a12-26ac-4c74-ae48-99d8a83ec581 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:18:22 np0005474864 nova_compute[192593]: 2025-10-07 20:18:22.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:18:22 np0005474864 nova_compute[192593]: 2025-10-07 20:18:22.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  7 16:18:22 np0005474864 nova_compute[192593]: 2025-10-07 20:18:22.230 2 DEBUG nova.network.neutron [-] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:18:22 np0005474864 nova_compute[192593]: 2025-10-07 20:18:22.275 2 INFO nova.compute.manager [-] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Took 1.49 seconds to deallocate network for instance.#033[00m
Oct  7 16:18:22 np0005474864 nova_compute[192593]: 2025-10-07 20:18:22.354 2 DEBUG oslo_concurrency.lockutils [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:18:22 np0005474864 nova_compute[192593]: 2025-10-07 20:18:22.355 2 DEBUG oslo_concurrency.lockutils [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:18:22 np0005474864 nova_compute[192593]: 2025-10-07 20:18:22.498 2 DEBUG nova.compute.provider_tree [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:18:22 np0005474864 nova_compute[192593]: 2025-10-07 20:18:22.518 2 DEBUG nova.scheduler.client.report [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:18:22 np0005474864 nova_compute[192593]: 2025-10-07 20:18:22.544 2 DEBUG oslo_concurrency.lockutils [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.189s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:18:22 np0005474864 nova_compute[192593]: 2025-10-07 20:18:22.631 2 INFO nova.scheduler.client.report [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Deleted allocations for instance fb25f45d-8789-4dda-9e61-b950ed2aa282#033[00m
Oct  7 16:18:22 np0005474864 nova_compute[192593]: 2025-10-07 20:18:22.760 2 DEBUG oslo_concurrency.lockutils [None req-23821d9b-7786-4c9f-b487-84b6d4063971 ab16a639b2af44c7bc4218a1b1b91068 4b69ac5dc2b44912af0aa0671c7e3696 - - default default] Lock "fb25f45d-8789-4dda-9e61-b950ed2aa282" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.357s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:18:23 np0005474864 podman[226234]: 2025-10-07 20:18:23.446502863 +0000 UTC m=+0.131085007 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Oct  7 16:18:23 np0005474864 nova_compute[192593]: 2025-10-07 20:18:23.824 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.023 2 DEBUG nova.network.neutron [req-78f53fee-970a-4f9a-9053-5298ad0af67d req-567b88b1-2dc2-4b20-a9a8-306adc4e4433 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Updated VIF entry in instance network info cache for port 33558a12-26ac-4c74-ae48-99d8a83ec581. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.024 2 DEBUG nova.network.neutron [req-78f53fee-970a-4f9a-9053-5298ad0af67d req-567b88b1-2dc2-4b20-a9a8-306adc4e4433 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Updating instance_info_cache with network_info: [{"id": "33558a12-26ac-4c74-ae48-99d8a83ec581", "address": "fa:16:3e:eb:38:05", "network": {"id": "debddd0e-a73a-4300-8608-f32a09aaf5f5", "bridge": "br-int", "label": "tempest-network-smoke--285528793", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "4b69ac5dc2b44912af0aa0671c7e3696", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap33558a12-26", "ovs_interfaceid": "33558a12-26ac-4c74-ae48-99d8a83ec581", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.042 2 DEBUG oslo_concurrency.lockutils [req-78f53fee-970a-4f9a-9053-5298ad0af67d req-567b88b1-2dc2-4b20-a9a8-306adc4e4433 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-fb25f45d-8789-4dda-9e61-b950ed2aa282" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.109 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.109 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.110 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.129 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.129 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.130 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.130 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.330 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.331 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5739MB free_disk=73.46341705322266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.331 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.332 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.396 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.397 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.420 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.436 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.462 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.462 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.495 2 DEBUG nova.compute.manager [req-6242d406-afef-47b0-9092-edfaa717fabc req-82f5ff77-4733-47d3-83c3-bb582fd2eba8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Received event network-vif-unplugged-33558a12-26ac-4c74-ae48-99d8a83ec581 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.496 2 DEBUG oslo_concurrency.lockutils [req-6242d406-afef-47b0-9092-edfaa717fabc req-82f5ff77-4733-47d3-83c3-bb582fd2eba8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "fb25f45d-8789-4dda-9e61-b950ed2aa282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.496 2 DEBUG oslo_concurrency.lockutils [req-6242d406-afef-47b0-9092-edfaa717fabc req-82f5ff77-4733-47d3-83c3-bb582fd2eba8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "fb25f45d-8789-4dda-9e61-b950ed2aa282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.496 2 DEBUG oslo_concurrency.lockutils [req-6242d406-afef-47b0-9092-edfaa717fabc req-82f5ff77-4733-47d3-83c3-bb582fd2eba8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "fb25f45d-8789-4dda-9e61-b950ed2aa282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.496 2 DEBUG nova.compute.manager [req-6242d406-afef-47b0-9092-edfaa717fabc req-82f5ff77-4733-47d3-83c3-bb582fd2eba8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] No waiting events found dispatching network-vif-unplugged-33558a12-26ac-4c74-ae48-99d8a83ec581 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.497 2 WARNING nova.compute.manager [req-6242d406-afef-47b0-9092-edfaa717fabc req-82f5ff77-4733-47d3-83c3-bb582fd2eba8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Received unexpected event network-vif-unplugged-33558a12-26ac-4c74-ae48-99d8a83ec581 for instance with vm_state deleted and task_state None.#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.497 2 DEBUG nova.compute.manager [req-6242d406-afef-47b0-9092-edfaa717fabc req-82f5ff77-4733-47d3-83c3-bb582fd2eba8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Received event network-vif-plugged-33558a12-26ac-4c74-ae48-99d8a83ec581 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.497 2 DEBUG oslo_concurrency.lockutils [req-6242d406-afef-47b0-9092-edfaa717fabc req-82f5ff77-4733-47d3-83c3-bb582fd2eba8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "fb25f45d-8789-4dda-9e61-b950ed2aa282-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.497 2 DEBUG oslo_concurrency.lockutils [req-6242d406-afef-47b0-9092-edfaa717fabc req-82f5ff77-4733-47d3-83c3-bb582fd2eba8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "fb25f45d-8789-4dda-9e61-b950ed2aa282-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.498 2 DEBUG oslo_concurrency.lockutils [req-6242d406-afef-47b0-9092-edfaa717fabc req-82f5ff77-4733-47d3-83c3-bb582fd2eba8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "fb25f45d-8789-4dda-9e61-b950ed2aa282-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.498 2 DEBUG nova.compute.manager [req-6242d406-afef-47b0-9092-edfaa717fabc req-82f5ff77-4733-47d3-83c3-bb582fd2eba8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] No waiting events found dispatching network-vif-plugged-33558a12-26ac-4c74-ae48-99d8a83ec581 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.498 2 WARNING nova.compute.manager [req-6242d406-afef-47b0-9092-edfaa717fabc req-82f5ff77-4733-47d3-83c3-bb582fd2eba8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Received unexpected event network-vif-plugged-33558a12-26ac-4c74-ae48-99d8a83ec581 for instance with vm_state deleted and task_state None.#033[00m
Oct  7 16:18:24 np0005474864 nova_compute[192593]: 2025-10-07 20:18:24.498 2 DEBUG nova.compute.manager [req-6242d406-afef-47b0-9092-edfaa717fabc req-82f5ff77-4733-47d3-83c3-bb582fd2eba8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Received event network-vif-deleted-33558a12-26ac-4c74-ae48-99d8a83ec581 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:18:25 np0005474864 nova_compute[192593]: 2025-10-07 20:18:25.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:25 np0005474864 nova_compute[192593]: 2025-10-07 20:18:25.464 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:25 np0005474864 nova_compute[192593]: 2025-10-07 20:18:25.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:26 np0005474864 nova_compute[192593]: 2025-10-07 20:18:26.441 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:18:26 np0005474864 nova_compute[192593]: 2025-10-07 20:18:26.442 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:18:26 np0005474864 nova_compute[192593]: 2025-10-07 20:18:26.886 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:18:26 np0005474864 nova_compute[192593]: 2025-10-07 20:18:26.887 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:18:26 np0005474864 nova_compute[192593]: 2025-10-07 20:18:26.916 2 DEBUG nova.compute.manager [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.015 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.016 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.025 2 DEBUG nova.virt.hardware [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.026 2 INFO nova.compute.claims [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Claim successful on node compute-2.ctlplane.example.com
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.158 2 DEBUG nova.compute.provider_tree [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.175 2 DEBUG nova.scheduler.client.report [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.196 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.181s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.197 2 DEBUG nova.compute.manager [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.245 2 DEBUG nova.compute.manager [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.246 2 DEBUG nova.network.neutron [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.264 2 INFO nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.285 2 DEBUG nova.compute.manager [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.376 2 DEBUG nova.compute.manager [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.378 2 DEBUG nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.378 2 INFO nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Creating image(s)
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.379 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "/var/lib/nova/instances/d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.379 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "/var/lib/nova/instances/d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.380 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "/var/lib/nova/instances/d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.395 2 DEBUG oslo_concurrency.processutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.482 2 DEBUG oslo_concurrency.processutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.484 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.485 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.503 2 DEBUG oslo_concurrency.processutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.589 2 DEBUG oslo_concurrency.processutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.590 2 DEBUG oslo_concurrency.processutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.626 2 DEBUG oslo_concurrency.processutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb/disk 1073741824" returned: 0 in 0.036s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.628 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.628 2 DEBUG oslo_concurrency.processutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.689 2 DEBUG oslo_concurrency.processutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.690 2 DEBUG nova.virt.disk.api [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Checking if we can resize image /var/lib/nova/instances/d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.690 2 DEBUG oslo_concurrency.processutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.749 2 DEBUG oslo_concurrency.processutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.751 2 DEBUG nova.virt.disk.api [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Cannot resize image /var/lib/nova/instances/d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.752 2 DEBUG nova.objects.instance [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'migration_context' on Instance uuid d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.771 2 DEBUG nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.772 2 DEBUG nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Ensure instance console log exists: /var/lib/nova/instances/d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.773 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.773 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.774 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:18:27 np0005474864 nova_compute[192593]: 2025-10-07 20:18:27.895 2 DEBUG nova.policy [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Oct  7 16:18:28 np0005474864 nova_compute[192593]: 2025-10-07 20:18:28.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:18:28 np0005474864 nova_compute[192593]: 2025-10-07 20:18:28.092 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  7 16:18:28 np0005474864 nova_compute[192593]: 2025-10-07 20:18:28.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  7 16:18:28 np0005474864 nova_compute[192593]: 2025-10-07 20:18:28.139 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871
Oct  7 16:18:28 np0005474864 nova_compute[192593]: 2025-10-07 20:18:28.140 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  7 16:18:28 np0005474864 nova_compute[192593]: 2025-10-07 20:18:28.141 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:18:28 np0005474864 nova_compute[192593]: 2025-10-07 20:18:28.142 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Oct  7 16:18:28 np0005474864 nova_compute[192593]: 2025-10-07 20:18:28.158 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Oct  7 16:18:28 np0005474864 nova_compute[192593]: 2025-10-07 20:18:28.826 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:18:28 np0005474864 rsyslogd[1010]: imjournal: journal files changed, reloading...  [v8.2506.0-2.el9 try https://www.rsyslog.com/e/0 ]
Oct  7 16:18:29 np0005474864 nova_compute[192593]: 2025-10-07 20:18:29.404 2 DEBUG nova.network.neutron [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Successfully created port: a38146d7-e32f-44a1-8e08-fe0768628fad _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  7 16:18:30 np0005474864 nova_compute[192593]: 2025-10-07 20:18:30.014 2 DEBUG nova.network.neutron [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Successfully created port: c2c25c19-d77a-413a-a3a7-f46e76d10088 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Oct  7 16:18:30 np0005474864 nova_compute[192593]: 2025-10-07 20:18:30.109 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:18:30 np0005474864 nova_compute[192593]: 2025-10-07 20:18:30.110 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  7 16:18:30 np0005474864 nova_compute[192593]: 2025-10-07 20:18:30.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:18:31 np0005474864 nova_compute[192593]: 2025-10-07 20:18:31.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:18:31 np0005474864 nova_compute[192593]: 2025-10-07 20:18:31.114 2 DEBUG nova.network.neutron [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Successfully updated port: a38146d7-e32f-44a1-8e08-fe0768628fad _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Oct  7 16:18:31 np0005474864 nova_compute[192593]: 2025-10-07 20:18:31.203 2 DEBUG nova.compute.manager [req-2840b23c-7dc9-42d3-ac29-b66c7889e8ad req-69f6b408-9db9-4770-86b8-9f1158b64451 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received event network-changed-a38146d7-e32f-44a1-8e08-fe0768628fad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Oct  7 16:18:31 np0005474864 nova_compute[192593]: 2025-10-07 20:18:31.204 2 DEBUG nova.compute.manager [req-2840b23c-7dc9-42d3-ac29-b66c7889e8ad req-69f6b408-9db9-4770-86b8-9f1158b64451 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Refreshing instance network info cache due to event network-changed-a38146d7-e32f-44a1-8e08-fe0768628fad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Oct  7 16:18:31 np0005474864 nova_compute[192593]: 2025-10-07 20:18:31.204 2 DEBUG oslo_concurrency.lockutils [req-2840b23c-7dc9-42d3-ac29-b66c7889e8ad req-69f6b408-9db9-4770-86b8-9f1158b64451 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Oct  7 16:18:31 np0005474864 nova_compute[192593]: 2025-10-07 20:18:31.205 2 DEBUG oslo_concurrency.lockutils [req-2840b23c-7dc9-42d3-ac29-b66c7889e8ad req-69f6b408-9db9-4770-86b8-9f1158b64451 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Oct  7 16:18:31 np0005474864 nova_compute[192593]: 2025-10-07 20:18:31.205 2 DEBUG nova.network.neutron [req-2840b23c-7dc9-42d3-ac29-b66c7889e8ad req-69f6b408-9db9-4770-86b8-9f1158b64451 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Refreshing network info cache for port a38146d7-e32f-44a1-8e08-fe0768628fad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.254 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.255 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.258 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:18:31.258 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:18:31 np0005474864 nova_compute[192593]: 2025-10-07 20:18:31.703 2 DEBUG nova.network.neutron [req-2840b23c-7dc9-42d3-ac29-b66c7889e8ad req-69f6b408-9db9-4770-86b8-9f1158b64451 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:18:32 np0005474864 nova_compute[192593]: 2025-10-07 20:18:32.136 2 DEBUG nova.network.neutron [req-2840b23c-7dc9-42d3-ac29-b66c7889e8ad req-69f6b408-9db9-4770-86b8-9f1158b64451 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:18:32 np0005474864 nova_compute[192593]: 2025-10-07 20:18:32.154 2 DEBUG oslo_concurrency.lockutils [req-2840b23c-7dc9-42d3-ac29-b66c7889e8ad req-69f6b408-9db9-4770-86b8-9f1158b64451 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:18:32 np0005474864 nova_compute[192593]: 2025-10-07 20:18:32.417 2 DEBUG nova.network.neutron [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Successfully updated port: c2c25c19-d77a-413a-a3a7-f46e76d10088 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:18:32 np0005474864 nova_compute[192593]: 2025-10-07 20:18:32.434 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "refresh_cache-d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:18:32 np0005474864 nova_compute[192593]: 2025-10-07 20:18:32.434 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquired lock "refresh_cache-d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:18:32 np0005474864 nova_compute[192593]: 2025-10-07 20:18:32.435 2 DEBUG nova.network.neutron [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:18:32 np0005474864 nova_compute[192593]: 2025-10-07 20:18:32.668 2 DEBUG nova.network.neutron [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:18:33 np0005474864 nova_compute[192593]: 2025-10-07 20:18:33.304 2 DEBUG nova.compute.manager [req-b35831cc-551c-4dda-9a3c-eac6b4916bf3 req-5dfc82a0-ea05-48ca-8e0a-d8371c3e560c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received event network-changed-c2c25c19-d77a-413a-a3a7-f46e76d10088 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:18:33 np0005474864 nova_compute[192593]: 2025-10-07 20:18:33.305 2 DEBUG nova.compute.manager [req-b35831cc-551c-4dda-9a3c-eac6b4916bf3 req-5dfc82a0-ea05-48ca-8e0a-d8371c3e560c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Refreshing instance network info cache due to event network-changed-c2c25c19-d77a-413a-a3a7-f46e76d10088. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:18:33 np0005474864 nova_compute[192593]: 2025-10-07 20:18:33.305 2 DEBUG oslo_concurrency.lockutils [req-b35831cc-551c-4dda-9a3c-eac6b4916bf3 req-5dfc82a0-ea05-48ca-8e0a-d8371c3e560c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:18:33 np0005474864 nova_compute[192593]: 2025-10-07 20:18:33.877 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:35 np0005474864 nova_compute[192593]: 2025-10-07 20:18:35.694 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759868300.6927795, fb25f45d-8789-4dda-9e61-b950ed2aa282 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:18:35 np0005474864 nova_compute[192593]: 2025-10-07 20:18:35.695 2 INFO nova.compute.manager [-] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] VM Stopped (Lifecycle Event)#033[00m
Oct  7 16:18:35 np0005474864 nova_compute[192593]: 2025-10-07 20:18:35.726 2 DEBUG nova.compute.manager [None req-8785a11f-1a2c-4bde-b1a2-e418b52d2e80 - - - - - -] [instance: fb25f45d-8789-4dda-9e61-b950ed2aa282] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:18:35 np0005474864 nova_compute[192593]: 2025-10-07 20:18:35.731 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:36 np0005474864 nova_compute[192593]: 2025-10-07 20:18:36.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:18:36 np0005474864 podman[226274]: 2025-10-07 20:18:36.447658607 +0000 UTC m=+0.121107890 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:18:36 np0005474864 podman[226275]: 2025-10-07 20:18:36.464642576 +0000 UTC m=+0.134304370 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, name=ubi9-minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, 
io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, io.openshift.tags=minimal rhel9)
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.463 2 DEBUG nova.network.neutron [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Updating instance_info_cache with network_info: [{"id": "a38146d7-e32f-44a1-8e08-fe0768628fad", "address": "fa:16:3e:1b:b2:c3", "network": {"id": "5d7c62c6-2d90-46cc-a731-d9564680bd8d", "bridge": "br-int", "label": "tempest-network-smoke--2117046408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38146d7-e3", "ovs_interfaceid": "a38146d7-e32f-44a1-8e08-fe0768628fad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "address": "fa:16:3e:21:c4:72", "network": {"id": "eb8078fd-1d3b-4c15-bb20-fdd51195fe7b", "bridge": "br-int", "label": "tempest-network-smoke--1066321255", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2c25c19-d7", "ovs_interfaceid": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.482 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Releasing lock "refresh_cache-d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.482 2 DEBUG nova.compute.manager [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Instance network_info: |[{"id": "a38146d7-e32f-44a1-8e08-fe0768628fad", "address": "fa:16:3e:1b:b2:c3", "network": {"id": "5d7c62c6-2d90-46cc-a731-d9564680bd8d", "bridge": "br-int", "label": "tempest-network-smoke--2117046408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38146d7-e3", "ovs_interfaceid": "a38146d7-e32f-44a1-8e08-fe0768628fad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "address": "fa:16:3e:21:c4:72", "network": {"id": "eb8078fd-1d3b-4c15-bb20-fdd51195fe7b", "bridge": "br-int", "label": "tempest-network-smoke--1066321255", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2c25c19-d7", "ovs_interfaceid": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.483 2 DEBUG oslo_concurrency.lockutils [req-b35831cc-551c-4dda-9a3c-eac6b4916bf3 req-5dfc82a0-ea05-48ca-8e0a-d8371c3e560c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.484 2 DEBUG nova.network.neutron [req-b35831cc-551c-4dda-9a3c-eac6b4916bf3 req-5dfc82a0-ea05-48ca-8e0a-d8371c3e560c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Refreshing network info cache for port c2c25c19-d77a-413a-a3a7-f46e76d10088 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.490 2 DEBUG nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Start _get_guest_xml network_info=[{"id": "a38146d7-e32f-44a1-8e08-fe0768628fad", "address": "fa:16:3e:1b:b2:c3", "network": {"id": "5d7c62c6-2d90-46cc-a731-d9564680bd8d", "bridge": "br-int", "label": "tempest-network-smoke--2117046408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38146d7-e3", "ovs_interfaceid": "a38146d7-e32f-44a1-8e08-fe0768628fad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "address": "fa:16:3e:21:c4:72", "network": {"id": "eb8078fd-1d3b-4c15-bb20-fdd51195fe7b", "bridge": "br-int", "label": "tempest-network-smoke--1066321255", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2c25c19-d7", "ovs_interfaceid": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.497 2 WARNING nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.508 2 DEBUG nova.virt.libvirt.host [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.509 2 DEBUG nova.virt.libvirt.host [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.513 2 DEBUG nova.virt.libvirt.host [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.513 2 DEBUG nova.virt.libvirt.host [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.515 2 DEBUG nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.515 2 DEBUG nova.virt.hardware [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T20:09:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3fec056a-1226-48ad-a02c-e4fe097a9363',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.516 2 DEBUG nova.virt.hardware [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.517 2 DEBUG nova.virt.hardware [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.517 2 DEBUG nova.virt.hardware [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.517 2 DEBUG nova.virt.hardware [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.518 2 DEBUG nova.virt.hardware [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.518 2 DEBUG nova.virt.hardware [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.519 2 DEBUG nova.virt.hardware [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.519 2 DEBUG nova.virt.hardware [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.520 2 DEBUG nova.virt.hardware [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.520 2 DEBUG nova.virt.hardware [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.527 2 DEBUG nova.virt.libvirt.vif [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:18:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1473149966',display_name='tempest-TestGettingAddress-server-1473149966',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1473149966',id=37,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMVpVb9IKOXiIyt8QeMXuYNwP1j1ImakzhPiMFUv6kD+qwrPOPgjEH5+EMtRgArxAtP+3pEP1jUuMWiCtbEY3nY5ddQIGkSzLehEX07tKlh1pIlPR598TI6ZAvT1Nh8kgA==',key_name='tempest-TestGettingAddress-2067997596',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-mb68lu07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:18:27Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a38146d7-e32f-44a1-8e08-fe0768628fad", "address": "fa:16:3e:1b:b2:c3", "network": {"id": "5d7c62c6-2d90-46cc-a731-d9564680bd8d", "bridge": "br-int", "label": "tempest-network-smoke--2117046408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38146d7-e3", "ovs_interfaceid": "a38146d7-e32f-44a1-8e08-fe0768628fad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.528 2 DEBUG nova.network.os_vif_util [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "a38146d7-e32f-44a1-8e08-fe0768628fad", "address": "fa:16:3e:1b:b2:c3", "network": {"id": "5d7c62c6-2d90-46cc-a731-d9564680bd8d", "bridge": "br-int", "label": "tempest-network-smoke--2117046408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38146d7-e3", "ovs_interfaceid": "a38146d7-e32f-44a1-8e08-fe0768628fad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.529 2 DEBUG nova.network.os_vif_util [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:b2:c3,bridge_name='br-int',has_traffic_filtering=True,id=a38146d7-e32f-44a1-8e08-fe0768628fad,network=Network(5d7c62c6-2d90-46cc-a731-d9564680bd8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38146d7-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.531 2 DEBUG nova.virt.libvirt.vif [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:18:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1473149966',display_name='tempest-TestGettingAddress-server-1473149966',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1473149966',id=37,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMVpVb9IKOXiIyt8QeMXuYNwP1j1ImakzhPiMFUv6kD+qwrPOPgjEH5+EMtRgArxAtP+3pEP1jUuMWiCtbEY3nY5ddQIGkSzLehEX07tKlh1pIlPR598TI6ZAvT1Nh8kgA==',key_name='tempest-TestGettingAddress-2067997596',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-mb68lu07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:18:27Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "address": "fa:16:3e:21:c4:72", "network": {"id": "eb8078fd-1d3b-4c15-bb20-fdd51195fe7b", "bridge": "br-int", "label": "tempest-network-smoke--1066321255", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2c25c19-d7", "ovs_interfaceid": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.531 2 DEBUG nova.network.os_vif_util [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "address": "fa:16:3e:21:c4:72", "network": {"id": "eb8078fd-1d3b-4c15-bb20-fdd51195fe7b", "bridge": "br-int", "label": "tempest-network-smoke--1066321255", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2c25c19-d7", "ovs_interfaceid": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.533 2 DEBUG nova.network.os_vif_util [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:c4:72,bridge_name='br-int',has_traffic_filtering=True,id=c2c25c19-d77a-413a-a3a7-f46e76d10088,network=Network(eb8078fd-1d3b-4c15-bb20-fdd51195fe7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2c25c19-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.534 2 DEBUG nova.objects.instance [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'pci_devices' on Instance uuid d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.553 2 DEBUG nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] End _get_guest_xml xml=<domain type="kvm">
Oct  7 16:18:38 np0005474864 nova_compute[192593]:  <uuid>d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb</uuid>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:  <name>instance-00000025</name>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:  <memory>131072</memory>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:  <vcpu>1</vcpu>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <nova:name>tempest-TestGettingAddress-server-1473149966</nova:name>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <nova:creationTime>2025-10-07 20:18:38</nova:creationTime>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <nova:flavor name="m1.nano">
Oct  7 16:18:38 np0005474864 nova_compute[192593]:        <nova:memory>128</nova:memory>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:        <nova:disk>1</nova:disk>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:        <nova:swap>0</nova:swap>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:        <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:        <nova:vcpus>1</nova:vcpus>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      </nova:flavor>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <nova:owner>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:        <nova:user uuid="334f092941fc46c496c7def76b2cfe18">tempest-TestGettingAddress-626136673-project-member</nova:user>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:        <nova:project uuid="2f9bf744045540618c9980fd4a7694f5">tempest-TestGettingAddress-626136673</nova:project>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      </nova:owner>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <nova:ports>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:        <nova:port uuid="a38146d7-e32f-44a1-8e08-fe0768628fad">
Oct  7 16:18:38 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:        <nova:port uuid="c2c25c19-d77a-413a-a3a7-f46e76d10088">
Oct  7 16:18:38 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe21:c472" ipVersion="6"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe21:c472" ipVersion="6"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      </nova:ports>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    </nova:instance>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:  <sysinfo type="smbios">
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <entry name="manufacturer">RDO</entry>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <entry name="product">OpenStack Compute</entry>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <entry name="serial">d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb</entry>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <entry name="uuid">d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb</entry>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <entry name="family">Virtual Machine</entry>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <boot dev="hd"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <smbios mode="sysinfo"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <vmcoreinfo/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:  <clock offset="utc">
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <timer name="pit" tickpolicy="delay"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <timer name="hpet" present="no"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:  <cpu mode="custom" match="exact">
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <model>Nehalem</model>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <topology sockets="1" cores="1" threads="1"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <disk type="file" device="disk">
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb/disk"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <target dev="vda" bus="virtio"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <disk type="file" device="cdrom">
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <driver name="qemu" type="raw" cache="none"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb/disk.config"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <target dev="sda" bus="sata"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:1b:b2:c3"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <target dev="tapa38146d7-e3"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:21:c4:72"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <target dev="tapc2c25c19-d7"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <serial type="pty">
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <log file="/var/lib/nova/instances/d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb/console.log" append="off"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <input type="tablet" bus="usb"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <rng model="virtio">
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <backend model="random">/dev/urandom</backend>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <controller type="usb" index="0"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    <memballoon model="virtio">
Oct  7 16:18:38 np0005474864 nova_compute[192593]:      <stats period="10"/>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:18:38 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:18:38 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:18:38 np0005474864 nova_compute[192593]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.556 2 DEBUG nova.compute.manager [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Preparing to wait for external event network-vif-plugged-a38146d7-e32f-44a1-8e08-fe0768628fad prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.556 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.557 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.557 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.558 2 DEBUG nova.compute.manager [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Preparing to wait for external event network-vif-plugged-c2c25c19-d77a-413a-a3a7-f46e76d10088 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.558 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.558 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.559 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.560 2 DEBUG nova.virt.libvirt.vif [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:18:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1473149966',display_name='tempest-TestGettingAddress-server-1473149966',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1473149966',id=37,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMVpVb9IKOXiIyt8QeMXuYNwP1j1ImakzhPiMFUv6kD+qwrPOPgjEH5+EMtRgArxAtP+3pEP1jUuMWiCtbEY3nY5ddQIGkSzLehEX07tKlh1pIlPR598TI6ZAvT1Nh8kgA==',key_name='tempest-TestGettingAddress-2067997596',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-mb68lu07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:18:27Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "a38146d7-e32f-44a1-8e08-fe0768628fad", "address": "fa:16:3e:1b:b2:c3", "network": {"id": "5d7c62c6-2d90-46cc-a731-d9564680bd8d", "bridge": "br-int", "label": "tempest-network-smoke--2117046408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38146d7-e3", "ovs_interfaceid": "a38146d7-e32f-44a1-8e08-fe0768628fad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.561 2 DEBUG nova.network.os_vif_util [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "a38146d7-e32f-44a1-8e08-fe0768628fad", "address": "fa:16:3e:1b:b2:c3", "network": {"id": "5d7c62c6-2d90-46cc-a731-d9564680bd8d", "bridge": "br-int", "label": "tempest-network-smoke--2117046408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38146d7-e3", "ovs_interfaceid": "a38146d7-e32f-44a1-8e08-fe0768628fad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.562 2 DEBUG nova.network.os_vif_util [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1b:b2:c3,bridge_name='br-int',has_traffic_filtering=True,id=a38146d7-e32f-44a1-8e08-fe0768628fad,network=Network(5d7c62c6-2d90-46cc-a731-d9564680bd8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38146d7-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.563 2 DEBUG os_vif [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:b2:c3,bridge_name='br-int',has_traffic_filtering=True,id=a38146d7-e32f-44a1-8e08-fe0768628fad,network=Network(5d7c62c6-2d90-46cc-a731-d9564680bd8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38146d7-e3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.564 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.565 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.570 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa38146d7-e3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.571 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapa38146d7-e3, col_values=(('external_ids', {'iface-id': 'a38146d7-e32f-44a1-8e08-fe0768628fad', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1b:b2:c3', 'vm-uuid': 'd24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.574 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:18:38 np0005474864 NetworkManager[51631]: <info>  [1759868318.5796] manager: (tapa38146d7-e3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.592 2 INFO os_vif [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1b:b2:c3,bridge_name='br-int',has_traffic_filtering=True,id=a38146d7-e32f-44a1-8e08-fe0768628fad,network=Network(5d7c62c6-2d90-46cc-a731-d9564680bd8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38146d7-e3')#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.593 2 DEBUG nova.virt.libvirt.vif [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:18:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1473149966',display_name='tempest-TestGettingAddress-server-1473149966',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1473149966',id=37,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMVpVb9IKOXiIyt8QeMXuYNwP1j1ImakzhPiMFUv6kD+qwrPOPgjEH5+EMtRgArxAtP+3pEP1jUuMWiCtbEY3nY5ddQIGkSzLehEX07tKlh1pIlPR598TI6ZAvT1Nh8kgA==',key_name='tempest-TestGettingAddress-2067997596',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-mb68lu07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:18:27Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "address": "fa:16:3e:21:c4:72", "network": {"id": "eb8078fd-1d3b-4c15-bb20-fdd51195fe7b", "bridge": "br-int", "label": "tempest-network-smoke--1066321255", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2c25c19-d7", "ovs_interfaceid": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.594 2 DEBUG nova.network.os_vif_util [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "address": "fa:16:3e:21:c4:72", "network": {"id": "eb8078fd-1d3b-4c15-bb20-fdd51195fe7b", "bridge": "br-int", "label": "tempest-network-smoke--1066321255", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2c25c19-d7", "ovs_interfaceid": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.596 2 DEBUG nova.network.os_vif_util [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:c4:72,bridge_name='br-int',has_traffic_filtering=True,id=c2c25c19-d77a-413a-a3a7-f46e76d10088,network=Network(eb8078fd-1d3b-4c15-bb20-fdd51195fe7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2c25c19-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.596 2 DEBUG os_vif [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:c4:72,bridge_name='br-int',has_traffic_filtering=True,id=c2c25c19-d77a-413a-a3a7-f46e76d10088,network=Network(eb8078fd-1d3b-4c15-bb20-fdd51195fe7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2c25c19-d7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.597 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.597 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.598 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.601 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.601 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc2c25c19-d7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.602 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc2c25c19-d7, col_values=(('external_ids', {'iface-id': 'c2c25c19-d77a-413a-a3a7-f46e76d10088', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:21:c4:72', 'vm-uuid': 'd24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.603 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:38 np0005474864 NetworkManager[51631]: <info>  [1759868318.6060] manager: (tapc2c25c19-d7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/97)
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.607 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.614 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.616 2 INFO os_vif [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:c4:72,bridge_name='br-int',has_traffic_filtering=True,id=c2c25c19-d77a-413a-a3a7-f46e76d10088,network=Network(eb8078fd-1d3b-4c15-bb20-fdd51195fe7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2c25c19-d7')#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.662 2 DEBUG nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.663 2 DEBUG nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.663 2 DEBUG nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No VIF found with MAC fa:16:3e:1b:b2:c3, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.663 2 DEBUG nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No VIF found with MAC fa:16:3e:21:c4:72, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.664 2 INFO nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Using config drive#033[00m
Oct  7 16:18:38 np0005474864 nova_compute[192593]: 2025-10-07 20:18:38.909 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:40 np0005474864 nova_compute[192593]: 2025-10-07 20:18:40.437 2 INFO nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Creating config drive at /var/lib/nova/instances/d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb/disk.config#033[00m
Oct  7 16:18:40 np0005474864 nova_compute[192593]: 2025-10-07 20:18:40.444 2 DEBUG oslo_concurrency.processutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9oehv12p execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:18:40 np0005474864 nova_compute[192593]: 2025-10-07 20:18:40.579 2 DEBUG oslo_concurrency.processutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp9oehv12p" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:18:40 np0005474864 kernel: tapa38146d7-e3: entered promiscuous mode
Oct  7 16:18:40 np0005474864 NetworkManager[51631]: <info>  [1759868320.6720] manager: (tapa38146d7-e3): new Tun device (/org/freedesktop/NetworkManager/Devices/98)
Oct  7 16:18:40 np0005474864 ovn_controller[94801]: 2025-10-07T20:18:40Z|00187|binding|INFO|Claiming lport a38146d7-e32f-44a1-8e08-fe0768628fad for this chassis.
Oct  7 16:18:40 np0005474864 ovn_controller[94801]: 2025-10-07T20:18:40Z|00188|binding|INFO|a38146d7-e32f-44a1-8e08-fe0768628fad: Claiming fa:16:3e:1b:b2:c3 10.100.0.12
Oct  7 16:18:40 np0005474864 nova_compute[192593]: 2025-10-07 20:18:40.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:40 np0005474864 NetworkManager[51631]: <info>  [1759868320.6970] manager: (tapc2c25c19-d7): new Tun device (/org/freedesktop/NetworkManager/Devices/99)
Oct  7 16:18:40 np0005474864 kernel: tapc2c25c19-d7: entered promiscuous mode
Oct  7 16:18:40 np0005474864 nova_compute[192593]: 2025-10-07 20:18:40.703 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:40 np0005474864 nova_compute[192593]: 2025-10-07 20:18:40.708 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:40 np0005474864 ovn_controller[94801]: 2025-10-07T20:18:40Z|00189|if_status|INFO|Dropped 1 log messages in last 146 seconds (most recently, 146 seconds ago) due to excessive rate
Oct  7 16:18:40 np0005474864 ovn_controller[94801]: 2025-10-07T20:18:40Z|00190|if_status|INFO|Not updating pb chassis for c2c25c19-d77a-413a-a3a7-f46e76d10088 now as sb is readonly
Oct  7 16:18:40 np0005474864 NetworkManager[51631]: <info>  [1759868320.7210] manager: (patch-br-int-to-provnet-53a6f422-cce6-4082-98aa-3619989346bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Oct  7 16:18:40 np0005474864 NetworkManager[51631]: <info>  [1759868320.7226] manager: (patch-provnet-53a6f422-cce6-4082-98aa-3619989346bc-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Oct  7 16:18:40 np0005474864 nova_compute[192593]: 2025-10-07 20:18:40.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:40 np0005474864 systemd-udevd[226347]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:18:40 np0005474864 systemd-udevd[226346]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:18:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:40.726 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:b2:c3 10.100.0.12'], port_security=['fa:16:3e:1b:b2:c3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d7c62c6-2d90-46cc-a731-d9564680bd8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'dd2506da-8b62-4c13-b6c9-526ad86198a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=edd326b8-5bac-4160-beff-f2be311fa22f, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=a38146d7-e32f-44a1-8e08-fe0768628fad) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:18:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:40.728 103685 INFO neutron.agent.ovn.metadata.agent [-] Port a38146d7-e32f-44a1-8e08-fe0768628fad in datapath 5d7c62c6-2d90-46cc-a731-d9564680bd8d bound to our chassis#033[00m
Oct  7 16:18:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:40.729 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5d7c62c6-2d90-46cc-a731-d9564680bd8d#033[00m
Oct  7 16:18:40 np0005474864 NetworkManager[51631]: <info>  [1759868320.7437] device (tapc2c25c19-d7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:18:40 np0005474864 NetworkManager[51631]: <info>  [1759868320.7469] device (tapc2c25c19-d7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:18:40 np0005474864 NetworkManager[51631]: <info>  [1759868320.7510] device (tapa38146d7-e3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:18:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:40.748 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[e7999b1b-aa80-46d1-b922-6267fbd828cd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:40.749 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5d7c62c6-21 in ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:18:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:40.752 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5d7c62c6-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:18:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:40.752 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[a8a3b7ac-0fea-4ac9-813e-b9a0cbbe3930]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:40.754 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[da534ab3-370f-4eb8-9906-483b52165c88]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:40 np0005474864 NetworkManager[51631]: <info>  [1759868320.7550] device (tapa38146d7-e3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:18:40 np0005474864 systemd-machined[152586]: New machine qemu-12-instance-00000025.
Oct  7 16:18:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:40.770 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[3f555730-b6f5-4f69-86fd-32b08ed6f571]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:40.803 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[38d9018c-2f8b-4c1c-85fb-c4fe9b602365]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:40.849 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[7d3fafe7-0686-44fa-a49a-4ef3f783c0f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:40 np0005474864 systemd[1]: Started Virtual Machine qemu-12-instance-00000025.
Oct  7 16:18:40 np0005474864 NetworkManager[51631]: <info>  [1759868320.8736] manager: (tap5d7c62c6-20): new Veth device (/org/freedesktop/NetworkManager/Devices/102)
Oct  7 16:18:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:40.873 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[2a26387e-3fea-466b-b13f-0364da94c30c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:40 np0005474864 nova_compute[192593]: 2025-10-07 20:18:40.894 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:40 np0005474864 ovn_controller[94801]: 2025-10-07T20:18:40Z|00191|binding|INFO|Claiming lport c2c25c19-d77a-413a-a3a7-f46e76d10088 for this chassis.
Oct  7 16:18:40 np0005474864 ovn_controller[94801]: 2025-10-07T20:18:40Z|00192|binding|INFO|c2c25c19-d77a-413a-a3a7-f46e76d10088: Claiming fa:16:3e:21:c4:72 2001:db8:0:1:f816:3eff:fe21:c472 2001:db8::f816:3eff:fe21:c472
Oct  7 16:18:40 np0005474864 ovn_controller[94801]: 2025-10-07T20:18:40Z|00193|binding|INFO|Setting lport a38146d7-e32f-44a1-8e08-fe0768628fad ovn-installed in OVS
Oct  7 16:18:40 np0005474864 ovn_controller[94801]: 2025-10-07T20:18:40Z|00194|binding|INFO|Setting lport a38146d7-e32f-44a1-8e08-fe0768628fad up in Southbound
Oct  7 16:18:40 np0005474864 nova_compute[192593]: 2025-10-07 20:18:40.923 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:40.928 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:21:c4:72 2001:db8:0:1:f816:3eff:fe21:c472 2001:db8::f816:3eff:fe21:c472'], port_security=['fa:16:3e:21:c4:72 2001:db8:0:1:f816:3eff:fe21:c472 2001:db8::f816:3eff:fe21:c472'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe21:c472/64 2001:db8::f816:3eff:fe21:c472/64', 'neutron:device_id': 'd24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'dd2506da-8b62-4c13-b6c9-526ad86198a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=77bd480d-eb42-45d3-bb40-70369c07639b, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=c2c25c19-d77a-413a-a3a7-f46e76d10088) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:18:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:40.930 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[6d848dc2-09b8-466e-af39-9614c846f745]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:40.934 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[dcaeeb57-6e52-4d38-89f5-f2870e6e6f3e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:40 np0005474864 ovn_controller[94801]: 2025-10-07T20:18:40Z|00195|binding|INFO|Setting lport c2c25c19-d77a-413a-a3a7-f46e76d10088 ovn-installed in OVS
Oct  7 16:18:40 np0005474864 ovn_controller[94801]: 2025-10-07T20:18:40Z|00196|binding|INFO|Setting lport c2c25c19-d77a-413a-a3a7-f46e76d10088 up in Southbound
Oct  7 16:18:40 np0005474864 nova_compute[192593]: 2025-10-07 20:18:40.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:40 np0005474864 NetworkManager[51631]: <info>  [1759868320.9763] device (tap5d7c62c6-20): carrier: link connected
Oct  7 16:18:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:40.987 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[faf4db31-5de0-4237-8163-21695e421fae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.020 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[42460e43-ae6e-4a81-bb7b-30294afeb474]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5d7c62c6-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:c7:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395686, 'reachable_time': 44338, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226383, 'error': None, 'target': 'ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.045 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[6c002946-b7f9-48ab-b5aa-10fbb1c7c46e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea0:c72e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 395686, 'tstamp': 395686}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226384, 'error': None, 'target': 'ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.074 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[0660f1ee-91d1-4229-9396-cd8aeab1b569]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5d7c62c6-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a0:c7:2e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 1, 'rx_bytes': 196, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 57], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395686, 'reachable_time': 44338, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 168, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 168, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226385, 'error': None, 'target': 'ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.125 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[67148e43-c0f7-4f81-b18d-33feb39be268]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.204 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[17b694c6-264d-4019-8298-319126316dd9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.206 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d7c62c6-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.206 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.206 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5d7c62c6-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:18:41 np0005474864 nova_compute[192593]: 2025-10-07 20:18:41.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:41 np0005474864 NetworkManager[51631]: <info>  [1759868321.2455] manager: (tap5d7c62c6-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/103)
Oct  7 16:18:41 np0005474864 kernel: tap5d7c62c6-20: entered promiscuous mode
Oct  7 16:18:41 np0005474864 nova_compute[192593]: 2025-10-07 20:18:41.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.248 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5d7c62c6-20, col_values=(('external_ids', {'iface-id': '134084f8-f559-4220-937b-cb889d06b007'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:18:41 np0005474864 nova_compute[192593]: 2025-10-07 20:18:41.249 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:41 np0005474864 ovn_controller[94801]: 2025-10-07T20:18:41Z|00197|binding|INFO|Releasing lport 134084f8-f559-4220-937b-cb889d06b007 from this chassis (sb_readonly=0)
Oct  7 16:18:41 np0005474864 nova_compute[192593]: 2025-10-07 20:18:41.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.253 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5d7c62c6-2d90-46cc-a731-d9564680bd8d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5d7c62c6-2d90-46cc-a731-d9564680bd8d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.254 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[e7ffb2a2-00ec-45ec-8bcd-2c47a391c52a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.255 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-5d7c62c6-2d90-46cc-a731-d9564680bd8d
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/5d7c62c6-2d90-46cc-a731-d9564680bd8d.pid.haproxy
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID 5d7c62c6-2d90-46cc-a731-d9564680bd8d
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.256 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d', 'env', 'PROCESS_TAG=haproxy-5d7c62c6-2d90-46cc-a731-d9564680bd8d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5d7c62c6-2d90-46cc-a731-d9564680bd8d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:18:41 np0005474864 nova_compute[192593]: 2025-10-07 20:18:41.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:41 np0005474864 podman[226417]: 2025-10-07 20:18:41.666149703 +0000 UTC m=+0.066110606 container create 83279611bf27896614a6b65f607743dd081bbe6653d32451f37a2027f0bc5528 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  7 16:18:41 np0005474864 nova_compute[192593]: 2025-10-07 20:18:41.698 2 DEBUG nova.compute.manager [req-c19ab998-4379-4098-84d1-3502bda9c516 req-8bd05023-e058-43da-91b3-e794de60aebf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received event network-vif-plugged-c2c25c19-d77a-413a-a3a7-f46e76d10088 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:18:41 np0005474864 nova_compute[192593]: 2025-10-07 20:18:41.700 2 DEBUG oslo_concurrency.lockutils [req-c19ab998-4379-4098-84d1-3502bda9c516 req-8bd05023-e058-43da-91b3-e794de60aebf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:18:41 np0005474864 nova_compute[192593]: 2025-10-07 20:18:41.700 2 DEBUG oslo_concurrency.lockutils [req-c19ab998-4379-4098-84d1-3502bda9c516 req-8bd05023-e058-43da-91b3-e794de60aebf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:18:41 np0005474864 nova_compute[192593]: 2025-10-07 20:18:41.701 2 DEBUG oslo_concurrency.lockutils [req-c19ab998-4379-4098-84d1-3502bda9c516 req-8bd05023-e058-43da-91b3-e794de60aebf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:18:41 np0005474864 nova_compute[192593]: 2025-10-07 20:18:41.702 2 DEBUG nova.compute.manager [req-c19ab998-4379-4098-84d1-3502bda9c516 req-8bd05023-e058-43da-91b3-e794de60aebf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Processing event network-vif-plugged-c2c25c19-d77a-413a-a3a7-f46e76d10088 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:18:41 np0005474864 podman[226417]: 2025-10-07 20:18:41.62753141 +0000 UTC m=+0.027492353 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:18:41 np0005474864 systemd[1]: Started libpod-conmon-83279611bf27896614a6b65f607743dd081bbe6653d32451f37a2027f0bc5528.scope.
Oct  7 16:18:41 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:18:41 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/2f174bdf6d4ad93719ed6bf2d962e3a02079286f559a65f5a622f3e51850debd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:18:41 np0005474864 podman[226417]: 2025-10-07 20:18:41.805618441 +0000 UTC m=+0.205579344 container init 83279611bf27896614a6b65f607743dd081bbe6653d32451f37a2027f0bc5528 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Oct  7 16:18:41 np0005474864 podman[226434]: 2025-10-07 20:18:41.809303117 +0000 UTC m=+0.081032846 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  7 16:18:41 np0005474864 podman[226417]: 2025-10-07 20:18:41.811098688 +0000 UTC m=+0.211059551 container start 83279611bf27896614a6b65f607743dd081bbe6653d32451f37a2027f0bc5528 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001)
Oct  7 16:18:41 np0005474864 neutron-haproxy-ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d[226463]: [NOTICE]   (226499) : New worker (226508) forked
Oct  7 16:18:41 np0005474864 neutron-haproxy-ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d[226463]: [NOTICE]   (226499) : Loading success.
Oct  7 16:18:41 np0005474864 podman[226430]: 2025-10-07 20:18:41.840458734 +0000 UTC m=+0.111045670 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:18:41 np0005474864 podman[226431]: 2025-10-07 20:18:41.859300417 +0000 UTC m=+0.131345995 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.894 103685 INFO neutron.agent.ovn.metadata.agent [-] Port c2c25c19-d77a-413a-a3a7-f46e76d10088 in datapath eb8078fd-1d3b-4c15-bb20-fdd51195fe7b unbound from our chassis#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.895 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network eb8078fd-1d3b-4c15-bb20-fdd51195fe7b#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.908 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d76fc3a4-0225-4779-9ca2-3cf988a8ade8]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.909 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapeb8078fd-11 in ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.912 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapeb8078fd-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.912 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c5110580-e123-4731-8b01-d8bcd6cbb072]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.913 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[6beb2dc1-c8c9-417f-9a06-aa32e1ff357c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.928 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[f6df224e-d723-4c9f-8d65-edd65ff0aa6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:41 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:41.959 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ef40dcee-29b5-40b6-94f0-ba5af6bf65d5]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:42.002 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[0ae90449-433f-480e-82b4-b21d84d6e560]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:42 np0005474864 systemd-udevd[226367]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:18:42 np0005474864 NetworkManager[51631]: <info>  [1759868322.0118] manager: (tapeb8078fd-10): new Veth device (/org/freedesktop/NetworkManager/Devices/104)
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:42.010 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[65f9f702-1465-4a71-ade7-732bfd164147]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:42.057 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[b898f809-514a-4843-bb05-9f3b00c0a45e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:42.060 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[c7fb8244-3f44-4080-a0ed-c6faa74878a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:42 np0005474864 NetworkManager[51631]: <info>  [1759868322.0920] device (tapeb8078fd-10): carrier: link connected
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:42.100 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[a806a7a7-c624-4c63-a914-3437f72a811c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:42.121 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[80784d53-c797-416c-98b3-e6d0e9f81430]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeb8078fd-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:51:dd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395798, 'reachable_time': 22702, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226530, 'error': None, 'target': 'ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:42.149 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[9a9cb8ef-4c09-484b-b8e3-52b690a4f66c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb9:51dd'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 395798, 'tstamp': 395798}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226531, 'error': None, 'target': 'ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:42.174 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[3114b622-bc0f-46f1-a72b-8a39d2f5cf6a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapeb8078fd-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b9:51:dd'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395798, 'reachable_time': 22702, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226532, 'error': None, 'target': 'ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:42.221 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[98dbbca4-97da-4485-9f6f-305ee84bc023]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:42.274 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[4fa39fe8-17c1-4c1a-b4eb-366f078d421c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:42.276 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeb8078fd-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:42.277 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:42.278 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapeb8078fd-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:18:42 np0005474864 nova_compute[192593]: 2025-10-07 20:18:42.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:42 np0005474864 NetworkManager[51631]: <info>  [1759868322.3215] manager: (tapeb8078fd-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Oct  7 16:18:42 np0005474864 kernel: tapeb8078fd-10: entered promiscuous mode
Oct  7 16:18:42 np0005474864 nova_compute[192593]: 2025-10-07 20:18:42.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:42.325 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapeb8078fd-10, col_values=(('external_ids', {'iface-id': '870e1f98-97fb-4700-aa9c-95d30287b180'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:18:42 np0005474864 nova_compute[192593]: 2025-10-07 20:18:42.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:42 np0005474864 ovn_controller[94801]: 2025-10-07T20:18:42Z|00198|binding|INFO|Releasing lport 870e1f98-97fb-4700-aa9c-95d30287b180 from this chassis (sb_readonly=0)
Oct  7 16:18:42 np0005474864 nova_compute[192593]: 2025-10-07 20:18:42.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:42.338 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/eb8078fd-1d3b-4c15-bb20-fdd51195fe7b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/eb8078fd-1d3b-4c15-bb20-fdd51195fe7b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:42.340 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ae5a71a8-fff3-49a9-acef-946a4a2f3a2b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:42.340 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/eb8078fd-1d3b-4c15-bb20-fdd51195fe7b.pid.haproxy
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID eb8078fd-1d3b-4c15-bb20-fdd51195fe7b
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:18:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:42.341 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b', 'env', 'PROCESS_TAG=haproxy-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/eb8078fd-1d3b-4c15-bb20-fdd51195fe7b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:18:42 np0005474864 nova_compute[192593]: 2025-10-07 20:18:42.359 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868322.3585289, d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:18:42 np0005474864 nova_compute[192593]: 2025-10-07 20:18:42.360 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] VM Started (Lifecycle Event)#033[00m
Oct  7 16:18:42 np0005474864 nova_compute[192593]: 2025-10-07 20:18:42.377 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:18:42 np0005474864 nova_compute[192593]: 2025-10-07 20:18:42.384 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868322.3597918, d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:18:42 np0005474864 nova_compute[192593]: 2025-10-07 20:18:42.385 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:18:42 np0005474864 nova_compute[192593]: 2025-10-07 20:18:42.406 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:18:42 np0005474864 nova_compute[192593]: 2025-10-07 20:18:42.411 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:18:42 np0005474864 nova_compute[192593]: 2025-10-07 20:18:42.414 2 DEBUG nova.network.neutron [req-b35831cc-551c-4dda-9a3c-eac6b4916bf3 req-5dfc82a0-ea05-48ca-8e0a-d8371c3e560c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Updated VIF entry in instance network info cache for port c2c25c19-d77a-413a-a3a7-f46e76d10088. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:18:42 np0005474864 nova_compute[192593]: 2025-10-07 20:18:42.415 2 DEBUG nova.network.neutron [req-b35831cc-551c-4dda-9a3c-eac6b4916bf3 req-5dfc82a0-ea05-48ca-8e0a-d8371c3e560c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Updating instance_info_cache with network_info: [{"id": "a38146d7-e32f-44a1-8e08-fe0768628fad", "address": "fa:16:3e:1b:b2:c3", "network": {"id": "5d7c62c6-2d90-46cc-a731-d9564680bd8d", "bridge": "br-int", "label": "tempest-network-smoke--2117046408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38146d7-e3", "ovs_interfaceid": "a38146d7-e32f-44a1-8e08-fe0768628fad", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "address": "fa:16:3e:21:c4:72", "network": {"id": "eb8078fd-1d3b-4c15-bb20-fdd51195fe7b", "bridge": "br-int", "label": "tempest-network-smoke--1066321255", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2c25c19-d7", "ovs_interfaceid": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:18:42 np0005474864 nova_compute[192593]: 2025-10-07 20:18:42.443 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:18:42 np0005474864 nova_compute[192593]: 2025-10-07 20:18:42.445 2 DEBUG oslo_concurrency.lockutils [req-b35831cc-551c-4dda-9a3c-eac6b4916bf3 req-5dfc82a0-ea05-48ca-8e0a-d8371c3e560c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:18:42 np0005474864 podman[226563]: 2025-10-07 20:18:42.762214487 +0000 UTC m=+0.063643804 container create b2a9cee97a583ba18871b8bf59a8f198b5e150935ca4eacd655b17f1dedf87e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Oct  7 16:18:42 np0005474864 systemd[1]: Started libpod-conmon-b2a9cee97a583ba18871b8bf59a8f198b5e150935ca4eacd655b17f1dedf87e1.scope.
Oct  7 16:18:42 np0005474864 podman[226563]: 2025-10-07 20:18:42.730408791 +0000 UTC m=+0.031838088 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:18:42 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:18:42 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ac489d5dd2ef082100874758edd92c83291c590a481cf37e7e09e47c8818ac93/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:18:42 np0005474864 podman[226563]: 2025-10-07 20:18:42.877872129 +0000 UTC m=+0.179301446 container init b2a9cee97a583ba18871b8bf59a8f198b5e150935ca4eacd655b17f1dedf87e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:18:42 np0005474864 podman[226563]: 2025-10-07 20:18:42.889976627 +0000 UTC m=+0.191405914 container start b2a9cee97a583ba18871b8bf59a8f198b5e150935ca4eacd655b17f1dedf87e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  7 16:18:42 np0005474864 neutron-haproxy-ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b[226578]: [NOTICE]   (226582) : New worker (226584) forked
Oct  7 16:18:42 np0005474864 neutron-haproxy-ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b[226578]: [NOTICE]   (226582) : Loading success.
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.220 2 DEBUG nova.compute.manager [req-e0d051db-b1d0-46e7-9d0c-052ffc5fd5fc req-294faea4-8981-4299-9a13-c4533a179f20 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received event network-vif-plugged-a38146d7-e32f-44a1-8e08-fe0768628fad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.220 2 DEBUG oslo_concurrency.lockutils [req-e0d051db-b1d0-46e7-9d0c-052ffc5fd5fc req-294faea4-8981-4299-9a13-c4533a179f20 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.221 2 DEBUG oslo_concurrency.lockutils [req-e0d051db-b1d0-46e7-9d0c-052ffc5fd5fc req-294faea4-8981-4299-9a13-c4533a179f20 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.221 2 DEBUG oslo_concurrency.lockutils [req-e0d051db-b1d0-46e7-9d0c-052ffc5fd5fc req-294faea4-8981-4299-9a13-c4533a179f20 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.222 2 DEBUG nova.compute.manager [req-e0d051db-b1d0-46e7-9d0c-052ffc5fd5fc req-294faea4-8981-4299-9a13-c4533a179f20 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Processing event network-vif-plugged-a38146d7-e32f-44a1-8e08-fe0768628fad _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.223 2 DEBUG nova.compute.manager [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.227 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868323.2273164, d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.228 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.231 2 DEBUG nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.236 2 INFO nova.virt.libvirt.driver [-] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Instance spawned successfully.#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.237 2 DEBUG nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.266 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.276 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.281 2 DEBUG nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.282 2 DEBUG nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.283 2 DEBUG nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.284 2 DEBUG nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.284 2 DEBUG nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.285 2 DEBUG nova.virt.libvirt.driver [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.319 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.349 2 INFO nova.compute.manager [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Took 15.97 seconds to spawn the instance on the hypervisor.#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.350 2 DEBUG nova.compute.manager [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.414 2 INFO nova.compute.manager [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Took 16.42 seconds to build instance.#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.432 2 DEBUG oslo_concurrency.lockutils [None req-9074a9e4-fdfa-4b54-adb5-90b5c30de013 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.545s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.605 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.818 2 DEBUG nova.compute.manager [req-b4defbe2-963d-4a7e-97f1-8d636ff2985b req-b8aa7d32-b321-4113-a86d-e983417db84f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received event network-vif-plugged-c2c25c19-d77a-413a-a3a7-f46e76d10088 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.819 2 DEBUG oslo_concurrency.lockutils [req-b4defbe2-963d-4a7e-97f1-8d636ff2985b req-b8aa7d32-b321-4113-a86d-e983417db84f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.820 2 DEBUG oslo_concurrency.lockutils [req-b4defbe2-963d-4a7e-97f1-8d636ff2985b req-b8aa7d32-b321-4113-a86d-e983417db84f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.820 2 DEBUG oslo_concurrency.lockutils [req-b4defbe2-963d-4a7e-97f1-8d636ff2985b req-b8aa7d32-b321-4113-a86d-e983417db84f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.821 2 DEBUG nova.compute.manager [req-b4defbe2-963d-4a7e-97f1-8d636ff2985b req-b8aa7d32-b321-4113-a86d-e983417db84f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] No waiting events found dispatching network-vif-plugged-c2c25c19-d77a-413a-a3a7-f46e76d10088 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.821 2 WARNING nova.compute.manager [req-b4defbe2-963d-4a7e-97f1-8d636ff2985b req-b8aa7d32-b321-4113-a86d-e983417db84f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received unexpected event network-vif-plugged-c2c25c19-d77a-413a-a3a7-f46e76d10088 for instance with vm_state active and task_state None.#033[00m
Oct  7 16:18:43 np0005474864 nova_compute[192593]: 2025-10-07 20:18:43.911 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:46 np0005474864 nova_compute[192593]: 2025-10-07 20:18:46.845 2 DEBUG nova.compute.manager [req-40fb58a9-2537-4865-bd27-a53860f7d88b req-3fbd04fe-aa66-4374-b88a-5989d0fdbe9a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received event network-vif-plugged-a38146d7-e32f-44a1-8e08-fe0768628fad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:18:46 np0005474864 nova_compute[192593]: 2025-10-07 20:18:46.845 2 DEBUG oslo_concurrency.lockutils [req-40fb58a9-2537-4865-bd27-a53860f7d88b req-3fbd04fe-aa66-4374-b88a-5989d0fdbe9a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:18:46 np0005474864 nova_compute[192593]: 2025-10-07 20:18:46.846 2 DEBUG oslo_concurrency.lockutils [req-40fb58a9-2537-4865-bd27-a53860f7d88b req-3fbd04fe-aa66-4374-b88a-5989d0fdbe9a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:18:46 np0005474864 nova_compute[192593]: 2025-10-07 20:18:46.846 2 DEBUG oslo_concurrency.lockutils [req-40fb58a9-2537-4865-bd27-a53860f7d88b req-3fbd04fe-aa66-4374-b88a-5989d0fdbe9a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:18:46 np0005474864 nova_compute[192593]: 2025-10-07 20:18:46.846 2 DEBUG nova.compute.manager [req-40fb58a9-2537-4865-bd27-a53860f7d88b req-3fbd04fe-aa66-4374-b88a-5989d0fdbe9a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] No waiting events found dispatching network-vif-plugged-a38146d7-e32f-44a1-8e08-fe0768628fad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:18:46 np0005474864 nova_compute[192593]: 2025-10-07 20:18:46.847 2 WARNING nova.compute.manager [req-40fb58a9-2537-4865-bd27-a53860f7d88b req-3fbd04fe-aa66-4374-b88a-5989d0fdbe9a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received unexpected event network-vif-plugged-a38146d7-e32f-44a1-8e08-fe0768628fad for instance with vm_state active and task_state None.#033[00m
Oct  7 16:18:48 np0005474864 podman[226593]: 2025-10-07 20:18:48.37184208 +0000 UTC m=+0.069761981 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:18:48 np0005474864 nova_compute[192593]: 2025-10-07 20:18:48.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:48 np0005474864 nova_compute[192593]: 2025-10-07 20:18:48.940 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:48 np0005474864 nova_compute[192593]: 2025-10-07 20:18:48.954 2 DEBUG nova.compute.manager [req-8b899c91-9d37-4ca8-80f9-8d56ac238f30 req-c80fe46b-8c6d-4b96-86cc-b259a3bd3b64 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received event network-changed-a38146d7-e32f-44a1-8e08-fe0768628fad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:18:48 np0005474864 nova_compute[192593]: 2025-10-07 20:18:48.955 2 DEBUG nova.compute.manager [req-8b899c91-9d37-4ca8-80f9-8d56ac238f30 req-c80fe46b-8c6d-4b96-86cc-b259a3bd3b64 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Refreshing instance network info cache due to event network-changed-a38146d7-e32f-44a1-8e08-fe0768628fad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:18:48 np0005474864 nova_compute[192593]: 2025-10-07 20:18:48.955 2 DEBUG oslo_concurrency.lockutils [req-8b899c91-9d37-4ca8-80f9-8d56ac238f30 req-c80fe46b-8c6d-4b96-86cc-b259a3bd3b64 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:18:48 np0005474864 nova_compute[192593]: 2025-10-07 20:18:48.955 2 DEBUG oslo_concurrency.lockutils [req-8b899c91-9d37-4ca8-80f9-8d56ac238f30 req-c80fe46b-8c6d-4b96-86cc-b259a3bd3b64 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:18:48 np0005474864 nova_compute[192593]: 2025-10-07 20:18:48.955 2 DEBUG nova.network.neutron [req-8b899c91-9d37-4ca8-80f9-8d56ac238f30 req-c80fe46b-8c6d-4b96-86cc-b259a3bd3b64 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Refreshing network info cache for port a38146d7-e32f-44a1-8e08-fe0768628fad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:18:50 np0005474864 nova_compute[192593]: 2025-10-07 20:18:50.465 2 DEBUG nova.network.neutron [req-8b899c91-9d37-4ca8-80f9-8d56ac238f30 req-c80fe46b-8c6d-4b96-86cc-b259a3bd3b64 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Updated VIF entry in instance network info cache for port a38146d7-e32f-44a1-8e08-fe0768628fad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:18:50 np0005474864 nova_compute[192593]: 2025-10-07 20:18:50.466 2 DEBUG nova.network.neutron [req-8b899c91-9d37-4ca8-80f9-8d56ac238f30 req-c80fe46b-8c6d-4b96-86cc-b259a3bd3b64 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Updating instance_info_cache with network_info: [{"id": "a38146d7-e32f-44a1-8e08-fe0768628fad", "address": "fa:16:3e:1b:b2:c3", "network": {"id": "5d7c62c6-2d90-46cc-a731-d9564680bd8d", "bridge": "br-int", "label": "tempest-network-smoke--2117046408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38146d7-e3", "ovs_interfaceid": "a38146d7-e32f-44a1-8e08-fe0768628fad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "address": "fa:16:3e:21:c4:72", "network": {"id": "eb8078fd-1d3b-4c15-bb20-fdd51195fe7b", "bridge": "br-int", "label": "tempest-network-smoke--1066321255", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2c25c19-d7", "ovs_interfaceid": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:18:50 np0005474864 nova_compute[192593]: 2025-10-07 20:18:50.500 2 DEBUG oslo_concurrency.lockutils [req-8b899c91-9d37-4ca8-80f9-8d56ac238f30 req-c80fe46b-8c6d-4b96-86cc-b259a3bd3b64 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:18:51 np0005474864 podman[226613]: 2025-10-07 20:18:51.363784398 +0000 UTC m=+0.061135702 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  7 16:18:53 np0005474864 nova_compute[192593]: 2025-10-07 20:18:53.612 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:53 np0005474864 nova_compute[192593]: 2025-10-07 20:18:53.943 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:54 np0005474864 podman[226644]: 2025-10-07 20:18:54.409699589 +0000 UTC m=+0.098897300 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible)
Oct  7 16:18:56 np0005474864 ovn_controller[94801]: 2025-10-07T20:18:56Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1b:b2:c3 10.100.0.12
Oct  7 16:18:56 np0005474864 ovn_controller[94801]: 2025-10-07T20:18:56Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1b:b2:c3 10.100.0.12
Oct  7 16:18:58 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:58.288 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:76:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'ba:bb:91:e8:2b:5d'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:18:58 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:18:58.289 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  7 16:18:58 np0005474864 nova_compute[192593]: 2025-10-07 20:18:58.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:58 np0005474864 nova_compute[192593]: 2025-10-07 20:18:58.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:18:58 np0005474864 nova_compute[192593]: 2025-10-07 20:18:58.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:02 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:02.291 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:19:03 np0005474864 nova_compute[192593]: 2025-10-07 20:19:03.620 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:03 np0005474864 nova_compute[192593]: 2025-10-07 20:19:03.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:07 np0005474864 podman[226668]: 2025-10-07 20:19:07.378881603 +0000 UTC m=+0.066331821 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., version=9.6, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64)
Oct  7 16:19:07 np0005474864 podman[226667]: 2025-10-07 20:19:07.388367497 +0000 UTC m=+0.073664013 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.532 2 DEBUG nova.compute.manager [req-b8b26b75-ac2d-40c2-91c3-5117e0a0c3d2 req-3dcb168b-3aac-4a48-9a5d-14c1e6151791 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received event network-changed-a38146d7-e32f-44a1-8e08-fe0768628fad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.533 2 DEBUG nova.compute.manager [req-b8b26b75-ac2d-40c2-91c3-5117e0a0c3d2 req-3dcb168b-3aac-4a48-9a5d-14c1e6151791 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Refreshing instance network info cache due to event network-changed-a38146d7-e32f-44a1-8e08-fe0768628fad. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.533 2 DEBUG oslo_concurrency.lockutils [req-b8b26b75-ac2d-40c2-91c3-5117e0a0c3d2 req-3dcb168b-3aac-4a48-9a5d-14c1e6151791 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.534 2 DEBUG oslo_concurrency.lockutils [req-b8b26b75-ac2d-40c2-91c3-5117e0a0c3d2 req-3dcb168b-3aac-4a48-9a5d-14c1e6151791 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.534 2 DEBUG nova.network.neutron [req-b8b26b75-ac2d-40c2-91c3-5117e0a0c3d2 req-3dcb168b-3aac-4a48-9a5d-14c1e6151791 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Refreshing network info cache for port a38146d7-e32f-44a1-8e08-fe0768628fad _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.628 2 DEBUG oslo_concurrency.lockutils [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.628 2 DEBUG oslo_concurrency.lockutils [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.629 2 DEBUG oslo_concurrency.lockutils [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.630 2 DEBUG oslo_concurrency.lockutils [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.630 2 DEBUG oslo_concurrency.lockutils [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.632 2 INFO nova.compute.manager [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Terminating instance
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.634 2 DEBUG nova.compute.manager [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.635 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:19:08 np0005474864 kernel: tapa38146d7-e3 (unregistering): left promiscuous mode
Oct  7 16:19:08 np0005474864 NetworkManager[51631]: <info>  [1759868348.6661] device (tapa38146d7-e3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:19:08 np0005474864 ovn_controller[94801]: 2025-10-07T20:19:08Z|00199|binding|INFO|Releasing lport a38146d7-e32f-44a1-8e08-fe0768628fad from this chassis (sb_readonly=0)
Oct  7 16:19:08 np0005474864 ovn_controller[94801]: 2025-10-07T20:19:08Z|00200|binding|INFO|Setting lport a38146d7-e32f-44a1-8e08-fe0768628fad down in Southbound
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.679 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:19:08 np0005474864 ovn_controller[94801]: 2025-10-07T20:19:08Z|00201|binding|INFO|Removing iface tapa38146d7-e3 ovn-installed in OVS
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:19:08 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:08.688 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1b:b2:c3 10.100.0.12'], port_security=['fa:16:3e:1b:b2:c3 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': 'd24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5d7c62c6-2d90-46cc-a731-d9564680bd8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dd2506da-8b62-4c13-b6c9-526ad86198a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=edd326b8-5bac-4160-beff-f2be311fa22f, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=a38146d7-e32f-44a1-8e08-fe0768628fad) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:19:08 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:08.689 103685 INFO neutron.agent.ovn.metadata.agent [-] Port a38146d7-e32f-44a1-8e08-fe0768628fad in datapath 5d7c62c6-2d90-46cc-a731-d9564680bd8d unbound from our chassis
Oct  7 16:19:08 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:08.691 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5d7c62c6-2d90-46cc-a731-d9564680bd8d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Oct  7 16:19:08 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:08.692 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[0c447e87-42aa-4274-a8fd-e13c029aa1b5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Oct  7 16:19:08 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:08.693 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d namespace which is not needed anymore
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.700 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:19:08 np0005474864 kernel: tapc2c25c19-d7 (unregistering): left promiscuous mode
Oct  7 16:19:08 np0005474864 NetworkManager[51631]: <info>  [1759868348.7513] device (tapc2c25c19-d7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.753 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:19:08 np0005474864 ovn_controller[94801]: 2025-10-07T20:19:08Z|00202|binding|INFO|Releasing lport c2c25c19-d77a-413a-a3a7-f46e76d10088 from this chassis (sb_readonly=0)
Oct  7 16:19:08 np0005474864 ovn_controller[94801]: 2025-10-07T20:19:08Z|00203|binding|INFO|Setting lport c2c25c19-d77a-413a-a3a7-f46e76d10088 down in Southbound
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.765 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:19:08 np0005474864 ovn_controller[94801]: 2025-10-07T20:19:08Z|00204|binding|INFO|Removing iface tapc2c25c19-d7 ovn-installed in OVS
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:19:08 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:08.776 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:21:c4:72 2001:db8:0:1:f816:3eff:fe21:c472 2001:db8::f816:3eff:fe21:c472'], port_security=['fa:16:3e:21:c4:72 2001:db8:0:1:f816:3eff:fe21:c472 2001:db8::f816:3eff:fe21:c472'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe21:c472/64 2001:db8::f816:3eff:fe21:c472/64', 'neutron:device_id': 'd24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dd2506da-8b62-4c13-b6c9-526ad86198a5', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=77bd480d-eb42-45d3-bb40-70369c07639b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=c2c25c19-d77a-413a-a3a7-f46e76d10088) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:19:08 np0005474864 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000025.scope: Deactivated successfully.
Oct  7 16:19:08 np0005474864 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000025.scope: Consumed 14.955s CPU time.
Oct  7 16:19:08 np0005474864 systemd-machined[152586]: Machine qemu-12-instance-00000025 terminated.
Oct  7 16:19:08 np0005474864 NetworkManager[51631]: <info>  [1759868348.8724] manager: (tapc2c25c19-d7): new Tun device (/org/freedesktop/NetworkManager/Devices/106)
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.891 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.926 2 INFO nova.virt.libvirt.driver [-] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Instance destroyed successfully.
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.927 2 DEBUG nova.objects.instance [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'resources' on Instance uuid d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.957 2 DEBUG nova.virt.libvirt.vif [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:18:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1473149966',display_name='tempest-TestGettingAddress-server-1473149966',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1473149966',id=37,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMVpVb9IKOXiIyt8QeMXuYNwP1j1ImakzhPiMFUv6kD+qwrPOPgjEH5+EMtRgArxAtP+3pEP1jUuMWiCtbEY3nY5ddQIGkSzLehEX07tKlh1pIlPR598TI6ZAvT1Nh8kgA==',key_name='tempest-TestGettingAddress-2067997596',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:18:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-mb68lu07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:18:43Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "a38146d7-e32f-44a1-8e08-fe0768628fad", "address": "fa:16:3e:1b:b2:c3", "network": {"id": "5d7c62c6-2d90-46cc-a731-d9564680bd8d", "bridge": "br-int", "label": "tempest-network-smoke--2117046408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38146d7-e3", "ovs_interfaceid": "a38146d7-e32f-44a1-8e08-fe0768628fad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.958 2 DEBUG nova.network.os_vif_util [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "a38146d7-e32f-44a1-8e08-fe0768628fad", "address": "fa:16:3e:1b:b2:c3", "network": {"id": "5d7c62c6-2d90-46cc-a731-d9564680bd8d", "bridge": "br-int", "label": "tempest-network-smoke--2117046408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.238", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38146d7-e3", "ovs_interfaceid": "a38146d7-e32f-44a1-8e08-fe0768628fad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.959 2 DEBUG nova.network.os_vif_util [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1b:b2:c3,bridge_name='br-int',has_traffic_filtering=True,id=a38146d7-e32f-44a1-8e08-fe0768628fad,network=Network(5d7c62c6-2d90-46cc-a731-d9564680bd8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38146d7-e3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.959 2 DEBUG os_vif [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1b:b2:c3,bridge_name='br-int',has_traffic_filtering=True,id=a38146d7-e32f-44a1-8e08-fe0768628fad,network=Network(5d7c62c6-2d90-46cc-a731-d9564680bd8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38146d7-e3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.961 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa38146d7-e3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.965 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.972 2 INFO os_vif [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1b:b2:c3,bridge_name='br-int',has_traffic_filtering=True,id=a38146d7-e32f-44a1-8e08-fe0768628fad,network=Network(5d7c62c6-2d90-46cc-a731-d9564680bd8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapa38146d7-e3')
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.973 2 DEBUG nova.virt.libvirt.vif [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:18:25Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1473149966',display_name='tempest-TestGettingAddress-server-1473149966',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1473149966',id=37,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBMVpVb9IKOXiIyt8QeMXuYNwP1j1ImakzhPiMFUv6kD+qwrPOPgjEH5+EMtRgArxAtP+3pEP1jUuMWiCtbEY3nY5ddQIGkSzLehEX07tKlh1pIlPR598TI6ZAvT1Nh8kgA==',key_name='tempest-TestGettingAddress-2067997596',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:18:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-mb68lu07',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:18:43Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "address": "fa:16:3e:21:c4:72", "network": {"id": "eb8078fd-1d3b-4c15-bb20-fdd51195fe7b", "bridge": "br-int", "label": "tempest-network-smoke--1066321255", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": 
"2001:db8:0:1:f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2c25c19-d7", "ovs_interfaceid": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.973 2 DEBUG nova.network.os_vif_util [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "address": "fa:16:3e:21:c4:72", "network": {"id": "eb8078fd-1d3b-4c15-bb20-fdd51195fe7b", "bridge": "br-int", "label": "tempest-network-smoke--1066321255", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2c25c19-d7", "ovs_interfaceid": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.974 2 DEBUG nova.network.os_vif_util [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:21:c4:72,bridge_name='br-int',has_traffic_filtering=True,id=c2c25c19-d77a-413a-a3a7-f46e76d10088,network=Network(eb8078fd-1d3b-4c15-bb20-fdd51195fe7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2c25c19-d7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.975 2 DEBUG os_vif [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:c4:72,bridge_name='br-int',has_traffic_filtering=True,id=c2c25c19-d77a-413a-a3a7-f46e76d10088,network=Network(eb8078fd-1d3b-4c15-bb20-fdd51195fe7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2c25c19-d7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.977 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.977 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc2c25c19-d7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.983 2 INFO os_vif [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:21:c4:72,bridge_name='br-int',has_traffic_filtering=True,id=c2c25c19-d77a-413a-a3a7-f46e76d10088,network=Network(eb8078fd-1d3b-4c15-bb20-fdd51195fe7b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapc2c25c19-d7')
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.983 2 INFO nova.virt.libvirt.driver [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Deleting instance files /var/lib/nova/instances/d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb_del
Oct  7 16:19:08 np0005474864 nova_compute[192593]: 2025-10-07 20:19:08.984 2 INFO nova.virt.libvirt.driver [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Deletion of /var/lib/nova/instances/d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb_del complete
Oct  7 16:19:09 np0005474864 neutron-haproxy-ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d[226463]: [NOTICE]   (226499) : haproxy version is 2.8.14-c23fe91
Oct  7 16:19:09 np0005474864 neutron-haproxy-ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d[226463]: [NOTICE]   (226499) : path to executable is /usr/sbin/haproxy
Oct  7 16:19:09 np0005474864 neutron-haproxy-ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d[226463]: [WARNING]  (226499) : Exiting Master process...
Oct  7 16:19:09 np0005474864 neutron-haproxy-ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d[226463]: [WARNING]  (226499) : Exiting Master process...
Oct  7 16:19:09 np0005474864 neutron-haproxy-ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d[226463]: [ALERT]    (226499) : Current worker (226508) exited with code 143 (Terminated)
Oct  7 16:19:09 np0005474864 neutron-haproxy-ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d[226463]: [WARNING]  (226499) : All workers exited. Exiting... (0)
Oct  7 16:19:09 np0005474864 systemd[1]: libpod-83279611bf27896614a6b65f607743dd081bbe6653d32451f37a2027f0bc5528.scope: Deactivated successfully.
Oct  7 16:19:09 np0005474864 podman[226736]: 2025-10-07 20:19:09.01762987 +0000 UTC m=+0.217602169 container died 83279611bf27896614a6b65f607743dd081bbe6653d32451f37a2027f0bc5528 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  7 16:19:09 np0005474864 nova_compute[192593]: 2025-10-07 20:19:09.051 2 INFO nova.compute.manager [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Took 0.42 seconds to destroy the instance on the hypervisor.
Oct  7 16:19:09 np0005474864 nova_compute[192593]: 2025-10-07 20:19:09.052 2 DEBUG oslo.service.loopingcall [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Oct  7 16:19:09 np0005474864 nova_compute[192593]: 2025-10-07 20:19:09.052 2 DEBUG nova.compute.manager [-] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Oct  7 16:19:09 np0005474864 nova_compute[192593]: 2025-10-07 20:19:09.052 2 DEBUG nova.network.neutron [-] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Oct  7 16:19:09 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-83279611bf27896614a6b65f607743dd081bbe6653d32451f37a2027f0bc5528-userdata-shm.mount: Deactivated successfully.
Oct  7 16:19:09 np0005474864 systemd[1]: var-lib-containers-storage-overlay-2f174bdf6d4ad93719ed6bf2d962e3a02079286f559a65f5a622f3e51850debd-merged.mount: Deactivated successfully.
Oct  7 16:19:09 np0005474864 podman[226736]: 2025-10-07 20:19:09.752536379 +0000 UTC m=+0.952508718 container cleanup 83279611bf27896614a6b65f607743dd081bbe6653d32451f37a2027f0bc5528 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  7 16:19:09 np0005474864 systemd[1]: libpod-conmon-83279611bf27896614a6b65f607743dd081bbe6653d32451f37a2027f0bc5528.scope: Deactivated successfully.
Oct  7 16:19:09 np0005474864 podman[226792]: 2025-10-07 20:19:09.960886301 +0000 UTC m=+0.174980281 container remove 83279611bf27896614a6b65f607743dd081bbe6653d32451f37a2027f0bc5528 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Oct  7 16:19:09 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:09.969 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ae778478-0f16-4fed-84ff-6943df75a7af]: (4, ('Tue Oct  7 08:19:08 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d (83279611bf27896614a6b65f607743dd081bbe6653d32451f37a2027f0bc5528)\n83279611bf27896614a6b65f607743dd081bbe6653d32451f37a2027f0bc5528\nTue Oct  7 08:19:09 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d (83279611bf27896614a6b65f607743dd081bbe6653d32451f37a2027f0bc5528)\n83279611bf27896614a6b65f607743dd081bbe6653d32451f37a2027f0bc5528\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:09 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:09.972 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[6435527a-6432-48e4-b11f-6462c65e8b08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:09 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:09.973 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5d7c62c6-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:19:09 np0005474864 nova_compute[192593]: 2025-10-07 20:19:09.975 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:09 np0005474864 kernel: tap5d7c62c6-20: left promiscuous mode
Oct  7 16:19:09 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:09.984 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[61bf4187-c5ae-48ff-b452-9a4c088f601e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:09 np0005474864 nova_compute[192593]: 2025-10-07 20:19:09.996 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:10 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:10.012 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[44a2080e-f9ef-46b7-8823-3971a9404411]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:10 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:10.015 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[b88ed5fd-96b8-4d4c-9785-f21a2c8f369a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:10 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:10.044 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[06568311-604b-49b0-a8ce-152181c7ed44]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395673, 'reachable_time': 39446, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226808, 'error': None, 'target': 'ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:10 np0005474864 systemd[1]: run-netns-ovnmeta\x2d5d7c62c6\x2d2d90\x2d46cc\x2da731\x2dd9564680bd8d.mount: Deactivated successfully.
Oct  7 16:19:10 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:10.047 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5d7c62c6-2d90-46cc-a731-d9564680bd8d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:19:10 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:10.047 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[36260d15-fb44-4c62-804e-f4261f4217da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:10 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:10.052 103685 INFO neutron.agent.ovn.metadata.agent [-] Port c2c25c19-d77a-413a-a3a7-f46e76d10088 in datapath eb8078fd-1d3b-4c15-bb20-fdd51195fe7b unbound from our chassis#033[00m
Oct  7 16:19:10 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:10.054 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network eb8078fd-1d3b-4c15-bb20-fdd51195fe7b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:19:10 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:10.056 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[e9195ab3-86cb-453f-addf-9f564152041a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:10 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:10.056 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b namespace which is not needed anymore#033[00m
Oct  7 16:19:10 np0005474864 neutron-haproxy-ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b[226578]: [NOTICE]   (226582) : haproxy version is 2.8.14-c23fe91
Oct  7 16:19:10 np0005474864 neutron-haproxy-ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b[226578]: [NOTICE]   (226582) : path to executable is /usr/sbin/haproxy
Oct  7 16:19:10 np0005474864 neutron-haproxy-ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b[226578]: [WARNING]  (226582) : Exiting Master process...
Oct  7 16:19:10 np0005474864 neutron-haproxy-ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b[226578]: [ALERT]    (226582) : Current worker (226584) exited with code 143 (Terminated)
Oct  7 16:19:10 np0005474864 neutron-haproxy-ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b[226578]: [WARNING]  (226582) : All workers exited. Exiting... (0)
Oct  7 16:19:10 np0005474864 systemd[1]: libpod-b2a9cee97a583ba18871b8bf59a8f198b5e150935ca4eacd655b17f1dedf87e1.scope: Deactivated successfully.
Oct  7 16:19:10 np0005474864 podman[226827]: 2025-10-07 20:19:10.284989577 +0000 UTC m=+0.088065467 container died b2a9cee97a583ba18871b8bf59a8f198b5e150935ca4eacd655b17f1dedf87e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001)
Oct  7 16:19:10 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b2a9cee97a583ba18871b8bf59a8f198b5e150935ca4eacd655b17f1dedf87e1-userdata-shm.mount: Deactivated successfully.
Oct  7 16:19:10 np0005474864 systemd[1]: var-lib-containers-storage-overlay-ac489d5dd2ef082100874758edd92c83291c590a481cf37e7e09e47c8818ac93-merged.mount: Deactivated successfully.
Oct  7 16:19:10 np0005474864 podman[226827]: 2025-10-07 20:19:10.322607301 +0000 UTC m=+0.125683181 container cleanup b2a9cee97a583ba18871b8bf59a8f198b5e150935ca4eacd655b17f1dedf87e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:19:10 np0005474864 systemd[1]: libpod-conmon-b2a9cee97a583ba18871b8bf59a8f198b5e150935ca4eacd655b17f1dedf87e1.scope: Deactivated successfully.
Oct  7 16:19:10 np0005474864 podman[226857]: 2025-10-07 20:19:10.42705278 +0000 UTC m=+0.069049490 container remove b2a9cee97a583ba18871b8bf59a8f198b5e150935ca4eacd655b17f1dedf87e1 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:19:10 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:10.437 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[72f5ba69-e5ee-40fb-8722-ccf11a023c70]: (4, ('Tue Oct  7 08:19:10 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b (b2a9cee97a583ba18871b8bf59a8f198b5e150935ca4eacd655b17f1dedf87e1)\nb2a9cee97a583ba18871b8bf59a8f198b5e150935ca4eacd655b17f1dedf87e1\nTue Oct  7 08:19:10 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b (b2a9cee97a583ba18871b8bf59a8f198b5e150935ca4eacd655b17f1dedf87e1)\nb2a9cee97a583ba18871b8bf59a8f198b5e150935ca4eacd655b17f1dedf87e1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:10 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:10.439 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[18e9906d-c284-4cf4-80b9-a506eb3f87d4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:10 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:10.440 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapeb8078fd-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:19:10 np0005474864 kernel: tapeb8078fd-10: left promiscuous mode
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.444 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.467 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:10 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:10.472 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[7af8622f-286e-4a76-b1b1-18c96cfc8303]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:10 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:10.500 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[9e9765f0-e6c4-480c-8acd-d96fd611469b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:10 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:10.501 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[b075be42-d313-468e-b4c1-55ecd0385a75]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:10 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:10.530 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[81c44c67-8863-4260-a09e-839479fb014d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395788, 'reachable_time': 33341, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226872, 'error': None, 'target': 'ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:10 np0005474864 systemd[1]: run-netns-ovnmeta\x2deb8078fd\x2d1d3b\x2d4c15\x2dbb20\x2dfdd51195fe7b.mount: Deactivated successfully.
Oct  7 16:19:10 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:10.533 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-eb8078fd-1d3b-4c15-bb20-fdd51195fe7b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:19:10 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:10.533 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[5e229a5e-6748-4f46-8e9b-7a12918420da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.536 2 DEBUG nova.network.neutron [req-b8b26b75-ac2d-40c2-91c3-5117e0a0c3d2 req-3dcb168b-3aac-4a48-9a5d-14c1e6151791 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Updated VIF entry in instance network info cache for port a38146d7-e32f-44a1-8e08-fe0768628fad. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.537 2 DEBUG nova.network.neutron [req-b8b26b75-ac2d-40c2-91c3-5117e0a0c3d2 req-3dcb168b-3aac-4a48-9a5d-14c1e6151791 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Updating instance_info_cache with network_info: [{"id": "a38146d7-e32f-44a1-8e08-fe0768628fad", "address": "fa:16:3e:1b:b2:c3", "network": {"id": "5d7c62c6-2d90-46cc-a731-d9564680bd8d", "bridge": "br-int", "label": "tempest-network-smoke--2117046408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38146d7-e3", "ovs_interfaceid": "a38146d7-e32f-44a1-8e08-fe0768628fad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "address": "fa:16:3e:21:c4:72", "network": {"id": "eb8078fd-1d3b-4c15-bb20-fdd51195fe7b", "bridge": "br-int", "label": "tempest-network-smoke--1066321255", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], 
"gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe21:c472", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc2c25c19-d7", "ovs_interfaceid": "c2c25c19-d77a-413a-a3a7-f46e76d10088", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.583 2 DEBUG oslo_concurrency.lockutils [req-b8b26b75-ac2d-40c2-91c3-5117e0a0c3d2 req-3dcb168b-3aac-4a48-9a5d-14c1e6151791 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.599 2 DEBUG nova.compute.manager [req-e3ac7a50-c508-4c56-89c9-e08257c51ff2 req-8e7aad07-ccde-41bd-9a7c-6301d2ab56f5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received event network-vif-unplugged-a38146d7-e32f-44a1-8e08-fe0768628fad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.599 2 DEBUG oslo_concurrency.lockutils [req-e3ac7a50-c508-4c56-89c9-e08257c51ff2 req-8e7aad07-ccde-41bd-9a7c-6301d2ab56f5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.600 2 DEBUG oslo_concurrency.lockutils [req-e3ac7a50-c508-4c56-89c9-e08257c51ff2 req-8e7aad07-ccde-41bd-9a7c-6301d2ab56f5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.600 2 DEBUG oslo_concurrency.lockutils [req-e3ac7a50-c508-4c56-89c9-e08257c51ff2 req-8e7aad07-ccde-41bd-9a7c-6301d2ab56f5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.600 2 DEBUG nova.compute.manager [req-e3ac7a50-c508-4c56-89c9-e08257c51ff2 req-8e7aad07-ccde-41bd-9a7c-6301d2ab56f5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] No waiting events found dispatching network-vif-unplugged-a38146d7-e32f-44a1-8e08-fe0768628fad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.600 2 DEBUG nova.compute.manager [req-e3ac7a50-c508-4c56-89c9-e08257c51ff2 req-8e7aad07-ccde-41bd-9a7c-6301d2ab56f5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received event network-vif-unplugged-a38146d7-e32f-44a1-8e08-fe0768628fad for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.600 2 DEBUG nova.compute.manager [req-e3ac7a50-c508-4c56-89c9-e08257c51ff2 req-8e7aad07-ccde-41bd-9a7c-6301d2ab56f5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received event network-vif-plugged-a38146d7-e32f-44a1-8e08-fe0768628fad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.601 2 DEBUG oslo_concurrency.lockutils [req-e3ac7a50-c508-4c56-89c9-e08257c51ff2 req-8e7aad07-ccde-41bd-9a7c-6301d2ab56f5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.601 2 DEBUG oslo_concurrency.lockutils [req-e3ac7a50-c508-4c56-89c9-e08257c51ff2 req-8e7aad07-ccde-41bd-9a7c-6301d2ab56f5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.601 2 DEBUG oslo_concurrency.lockutils [req-e3ac7a50-c508-4c56-89c9-e08257c51ff2 req-8e7aad07-ccde-41bd-9a7c-6301d2ab56f5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.601 2 DEBUG nova.compute.manager [req-e3ac7a50-c508-4c56-89c9-e08257c51ff2 req-8e7aad07-ccde-41bd-9a7c-6301d2ab56f5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] No waiting events found dispatching network-vif-plugged-a38146d7-e32f-44a1-8e08-fe0768628fad pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.601 2 WARNING nova.compute.manager [req-e3ac7a50-c508-4c56-89c9-e08257c51ff2 req-8e7aad07-ccde-41bd-9a7c-6301d2ab56f5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received unexpected event network-vif-plugged-a38146d7-e32f-44a1-8e08-fe0768628fad for instance with vm_state active and task_state deleting.#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.681 2 DEBUG nova.compute.manager [req-f691af1b-6faf-4678-98e6-311dcf46ee64 req-35bf3cc8-76f3-4a9c-b163-3582fa40b545 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received event network-vif-unplugged-c2c25c19-d77a-413a-a3a7-f46e76d10088 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.682 2 DEBUG oslo_concurrency.lockutils [req-f691af1b-6faf-4678-98e6-311dcf46ee64 req-35bf3cc8-76f3-4a9c-b163-3582fa40b545 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.682 2 DEBUG oslo_concurrency.lockutils [req-f691af1b-6faf-4678-98e6-311dcf46ee64 req-35bf3cc8-76f3-4a9c-b163-3582fa40b545 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.683 2 DEBUG oslo_concurrency.lockutils [req-f691af1b-6faf-4678-98e6-311dcf46ee64 req-35bf3cc8-76f3-4a9c-b163-3582fa40b545 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.683 2 DEBUG nova.compute.manager [req-f691af1b-6faf-4678-98e6-311dcf46ee64 req-35bf3cc8-76f3-4a9c-b163-3582fa40b545 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] No waiting events found dispatching network-vif-unplugged-c2c25c19-d77a-413a-a3a7-f46e76d10088 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.683 2 DEBUG nova.compute.manager [req-f691af1b-6faf-4678-98e6-311dcf46ee64 req-35bf3cc8-76f3-4a9c-b163-3582fa40b545 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received event network-vif-unplugged-c2c25c19-d77a-413a-a3a7-f46e76d10088 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.684 2 DEBUG nova.compute.manager [req-f691af1b-6faf-4678-98e6-311dcf46ee64 req-35bf3cc8-76f3-4a9c-b163-3582fa40b545 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received event network-vif-plugged-c2c25c19-d77a-413a-a3a7-f46e76d10088 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.684 2 DEBUG oslo_concurrency.lockutils [req-f691af1b-6faf-4678-98e6-311dcf46ee64 req-35bf3cc8-76f3-4a9c-b163-3582fa40b545 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.684 2 DEBUG oslo_concurrency.lockutils [req-f691af1b-6faf-4678-98e6-311dcf46ee64 req-35bf3cc8-76f3-4a9c-b163-3582fa40b545 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.685 2 DEBUG oslo_concurrency.lockutils [req-f691af1b-6faf-4678-98e6-311dcf46ee64 req-35bf3cc8-76f3-4a9c-b163-3582fa40b545 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.685 2 DEBUG nova.compute.manager [req-f691af1b-6faf-4678-98e6-311dcf46ee64 req-35bf3cc8-76f3-4a9c-b163-3582fa40b545 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] No waiting events found dispatching network-vif-plugged-c2c25c19-d77a-413a-a3a7-f46e76d10088 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.685 2 WARNING nova.compute.manager [req-f691af1b-6faf-4678-98e6-311dcf46ee64 req-35bf3cc8-76f3-4a9c-b163-3582fa40b545 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received unexpected event network-vif-plugged-c2c25c19-d77a-413a-a3a7-f46e76d10088 for instance with vm_state active and task_state deleting.#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.686 2 DEBUG nova.compute.manager [req-f691af1b-6faf-4678-98e6-311dcf46ee64 req-35bf3cc8-76f3-4a9c-b163-3582fa40b545 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received event network-vif-deleted-c2c25c19-d77a-413a-a3a7-f46e76d10088 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.686 2 INFO nova.compute.manager [req-f691af1b-6faf-4678-98e6-311dcf46ee64 req-35bf3cc8-76f3-4a9c-b163-3582fa40b545 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Neutron deleted interface c2c25c19-d77a-413a-a3a7-f46e76d10088; detaching it from the instance and deleting it from the info cache#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.686 2 DEBUG nova.network.neutron [req-f691af1b-6faf-4678-98e6-311dcf46ee64 req-35bf3cc8-76f3-4a9c-b163-3582fa40b545 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Updating instance_info_cache with network_info: [{"id": "a38146d7-e32f-44a1-8e08-fe0768628fad", "address": "fa:16:3e:1b:b2:c3", "network": {"id": "5d7c62c6-2d90-46cc-a731-d9564680bd8d", "bridge": "br-int", "label": "tempest-network-smoke--2117046408", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapa38146d7-e3", "ovs_interfaceid": "a38146d7-e32f-44a1-8e08-fe0768628fad", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:19:10 np0005474864 nova_compute[192593]: 2025-10-07 20:19:10.922 2 DEBUG nova.compute.manager [req-f691af1b-6faf-4678-98e6-311dcf46ee64 req-35bf3cc8-76f3-4a9c-b163-3582fa40b545 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Detach interface failed, port_id=c2c25c19-d77a-413a-a3a7-f46e76d10088, reason: Instance d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  7 16:19:11 np0005474864 nova_compute[192593]: 2025-10-07 20:19:11.015 2 DEBUG nova.network.neutron [-] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:19:11 np0005474864 nova_compute[192593]: 2025-10-07 20:19:11.143 2 INFO nova.compute.manager [-] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Took 2.09 seconds to deallocate network for instance.#033[00m
Oct  7 16:19:11 np0005474864 nova_compute[192593]: 2025-10-07 20:19:11.368 2 DEBUG oslo_concurrency.lockutils [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:11 np0005474864 nova_compute[192593]: 2025-10-07 20:19:11.369 2 DEBUG oslo_concurrency.lockutils [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:11 np0005474864 nova_compute[192593]: 2025-10-07 20:19:11.434 2 DEBUG nova.compute.provider_tree [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:19:11 np0005474864 nova_compute[192593]: 2025-10-07 20:19:11.496 2 DEBUG nova.scheduler.client.report [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:19:11 np0005474864 nova_compute[192593]: 2025-10-07 20:19:11.696 2 DEBUG oslo_concurrency.lockutils [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.326s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:11 np0005474864 nova_compute[192593]: 2025-10-07 20:19:11.795 2 INFO nova.scheduler.client.report [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Deleted allocations for instance d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb#033[00m
Oct  7 16:19:12 np0005474864 nova_compute[192593]: 2025-10-07 20:19:12.021 2 DEBUG oslo_concurrency.lockutils [None req-d1b866a2-5c27-4a23-b34c-6ba369de6d93 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.392s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:12 np0005474864 podman[226875]: 2025-10-07 20:19:12.383539619 +0000 UTC m=+0.074028043 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:19:12 np0005474864 podman[226874]: 2025-10-07 20:19:12.415305244 +0000 UTC m=+0.105769028 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  7 16:19:12 np0005474864 podman[226873]: 2025-10-07 20:19:12.423265843 +0000 UTC m=+0.114407186 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, container_name=iscsid, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  7 16:19:12 np0005474864 nova_compute[192593]: 2025-10-07 20:19:12.962 2 DEBUG nova.compute.manager [req-6fb6547e-a84b-42ab-a4cc-90080d789593 req-953653b0-635e-44bc-b8bc-409da4b770a4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Received event network-vif-deleted-a38146d7-e32f-44a1-8e08-fe0768628fad external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:19:13 np0005474864 nova_compute[192593]: 2025-10-07 20:19:13.955 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:13 np0005474864 nova_compute[192593]: 2025-10-07 20:19:13.979 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:16.193 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:16.194 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:16.194 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.474 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.474 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.489 2 DEBUG nova.compute.manager [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.558 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.559 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.567 2 DEBUG nova.virt.hardware [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.567 2 INFO nova.compute.claims [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.699 2 DEBUG nova.compute.provider_tree [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.720 2 DEBUG nova.scheduler.client.report [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.741 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.742 2 DEBUG nova.compute.manager [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.812 2 DEBUG nova.compute.manager [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.813 2 DEBUG nova.network.neutron [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.836 2 INFO nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.860 2 DEBUG nova.compute.manager [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.971 2 DEBUG nova.compute.manager [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.972 2 DEBUG nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.973 2 INFO nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Creating image(s)#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.973 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "/var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.973 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "/var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.974 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "/var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:17 np0005474864 nova_compute[192593]: 2025-10-07 20:19:17.986 2 DEBUG oslo_concurrency.processutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.082 2 DEBUG oslo_concurrency.processutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.096s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.083 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.083 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.095 2 DEBUG oslo_concurrency.processutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.170 2 DEBUG oslo_concurrency.processutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.172 2 DEBUG oslo_concurrency.processutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.207 2 DEBUG oslo_concurrency.processutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/disk 1073741824" returned: 0 in 0.035s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.209 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.210 2 DEBUG oslo_concurrency.processutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.253 2 DEBUG nova.policy [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.301 2 DEBUG oslo_concurrency.processutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.303 2 DEBUG nova.virt.disk.api [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Checking if we can resize image /var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.303 2 DEBUG oslo_concurrency.processutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.363 2 DEBUG oslo_concurrency.processutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.365 2 DEBUG nova.virt.disk.api [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Cannot resize image /var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.366 2 DEBUG nova.objects.instance [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lazy-loading 'migration_context' on Instance uuid f32956f8-c207-4bd1-8e80-3da8c5b1b855 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.385 2 DEBUG nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.386 2 DEBUG nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Ensure instance console log exists: /var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.387 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.388 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.388 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.981 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.984 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:18 np0005474864 nova_compute[192593]: 2025-10-07 20:19:18.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:19:19 np0005474864 nova_compute[192593]: 2025-10-07 20:19:19.175 2 DEBUG nova.network.neutron [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Successfully updated port: accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:19:19 np0005474864 nova_compute[192593]: 2025-10-07 20:19:19.202 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "refresh_cache-f32956f8-c207-4bd1-8e80-3da8c5b1b855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:19:19 np0005474864 nova_compute[192593]: 2025-10-07 20:19:19.203 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquired lock "refresh_cache-f32956f8-c207-4bd1-8e80-3da8c5b1b855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:19:19 np0005474864 nova_compute[192593]: 2025-10-07 20:19:19.203 2 DEBUG nova.network.neutron [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:19:19 np0005474864 nova_compute[192593]: 2025-10-07 20:19:19.276 2 DEBUG nova.compute.manager [req-e7cf3f14-22ea-4a3f-94bc-1b26aa45d718 req-50704b20-4777-4515-a288-3644dd104dce 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Received event network-changed-accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:19:19 np0005474864 nova_compute[192593]: 2025-10-07 20:19:19.277 2 DEBUG nova.compute.manager [req-e7cf3f14-22ea-4a3f-94bc-1b26aa45d718 req-50704b20-4777-4515-a288-3644dd104dce 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Refreshing instance network info cache due to event network-changed-accc3f40-4f4d-47c4-bad6-6e3102c0f3f0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:19:19 np0005474864 nova_compute[192593]: 2025-10-07 20:19:19.277 2 DEBUG oslo_concurrency.lockutils [req-e7cf3f14-22ea-4a3f-94bc-1b26aa45d718 req-50704b20-4777-4515-a288-3644dd104dce 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-f32956f8-c207-4bd1-8e80-3da8c5b1b855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:19:19 np0005474864 podman[226956]: 2025-10-07 20:19:19.388364552 +0000 UTC m=+0.079532332 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  7 16:19:19 np0005474864 nova_compute[192593]: 2025-10-07 20:19:19.397 2 DEBUG nova.network.neutron [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.802 2 DEBUG nova.network.neutron [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Updating instance_info_cache with network_info: [{"id": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "address": "fa:16:3e:c4:3f:9d", "network": {"id": "d71ace99-df2c-4a61-8bb2-6e66a8f10129", "bridge": "br-int", "label": "tempest-network-smoke--188651707", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaccc3f40-4f", "ovs_interfaceid": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.829 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Releasing lock "refresh_cache-f32956f8-c207-4bd1-8e80-3da8c5b1b855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.829 2 DEBUG nova.compute.manager [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Instance network_info: |[{"id": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "address": "fa:16:3e:c4:3f:9d", "network": {"id": "d71ace99-df2c-4a61-8bb2-6e66a8f10129", "bridge": "br-int", "label": "tempest-network-smoke--188651707", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaccc3f40-4f", "ovs_interfaceid": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.830 2 DEBUG oslo_concurrency.lockutils [req-e7cf3f14-22ea-4a3f-94bc-1b26aa45d718 req-50704b20-4777-4515-a288-3644dd104dce 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-f32956f8-c207-4bd1-8e80-3da8c5b1b855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.831 2 DEBUG nova.network.neutron [req-e7cf3f14-22ea-4a3f-94bc-1b26aa45d718 req-50704b20-4777-4515-a288-3644dd104dce 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Refreshing network info cache for port accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.835 2 DEBUG nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Start _get_guest_xml network_info=[{"id": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "address": "fa:16:3e:c4:3f:9d", "network": {"id": "d71ace99-df2c-4a61-8bb2-6e66a8f10129", "bridge": "br-int", "label": "tempest-network-smoke--188651707", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaccc3f40-4f", "ovs_interfaceid": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.842 2 WARNING nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.849 2 DEBUG nova.virt.libvirt.host [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.850 2 DEBUG nova.virt.libvirt.host [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.862 2 DEBUG nova.virt.libvirt.host [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.863 2 DEBUG nova.virt.libvirt.host [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.865 2 DEBUG nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.866 2 DEBUG nova.virt.hardware [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T20:09:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3fec056a-1226-48ad-a02c-e4fe097a9363',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.867 2 DEBUG nova.virt.hardware [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.868 2 DEBUG nova.virt.hardware [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.868 2 DEBUG nova.virt.hardware [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.869 2 DEBUG nova.virt.hardware [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.869 2 DEBUG nova.virt.hardware [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.869 2 DEBUG nova.virt.hardware [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.870 2 DEBUG nova.virt.hardware [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.870 2 DEBUG nova.virt.hardware [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.871 2 DEBUG nova.virt.hardware [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.871 2 DEBUG nova.virt.hardware [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.878 2 DEBUG nova.virt.libvirt.vif [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:19:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-135876470',display_name='tempest-TestNetworkBasicOps-server-135876470',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-135876470',id=39,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIt1OLBqU97UGpjppu8mgiVJnW5nROWcm7iCx6jDKVY1rgmzACYh6Jd5TeGR3wTJEgdIBUIn8EqQb9kR45Tg521GxEW2tkCgoMk64ZlCdFGe00w0MTZgOAIMT0aYXQNBMg==',key_name='tempest-TestNetworkBasicOps-1948664557',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-j0ja292e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:19:17Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=f32956f8-c207-4bd1-8e80-3da8c5b1b855,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "address": "fa:16:3e:c4:3f:9d", "network": {"id": "d71ace99-df2c-4a61-8bb2-6e66a8f10129", "bridge": "br-int", "label": "tempest-network-smoke--188651707", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], 
"routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaccc3f40-4f", "ovs_interfaceid": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.879 2 DEBUG nova.network.os_vif_util [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converting VIF {"id": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "address": "fa:16:3e:c4:3f:9d", "network": {"id": "d71ace99-df2c-4a61-8bb2-6e66a8f10129", "bridge": "br-int", "label": "tempest-network-smoke--188651707", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaccc3f40-4f", "ovs_interfaceid": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.880 2 DEBUG nova.network.os_vif_util [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:3f:9d,bridge_name='br-int',has_traffic_filtering=True,id=accc3f40-4f4d-47c4-bad6-6e3102c0f3f0,network=Network(d71ace99-df2c-4a61-8bb2-6e66a8f10129),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapaccc3f40-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.881 2 DEBUG nova.objects.instance [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lazy-loading 'pci_devices' on Instance uuid f32956f8-c207-4bd1-8e80-3da8c5b1b855 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.899 2 DEBUG nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] End _get_guest_xml xml=<domain type="kvm">
Oct  7 16:19:20 np0005474864 nova_compute[192593]:  <uuid>f32956f8-c207-4bd1-8e80-3da8c5b1b855</uuid>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:  <name>instance-00000027</name>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:  <memory>131072</memory>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:  <vcpu>1</vcpu>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <nova:name>tempest-TestNetworkBasicOps-server-135876470</nova:name>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <nova:creationTime>2025-10-07 20:19:20</nova:creationTime>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <nova:flavor name="m1.nano">
Oct  7 16:19:20 np0005474864 nova_compute[192593]:        <nova:memory>128</nova:memory>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:        <nova:disk>1</nova:disk>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:        <nova:swap>0</nova:swap>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:        <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:        <nova:vcpus>1</nova:vcpus>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      </nova:flavor>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <nova:owner>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:        <nova:user uuid="fde8db13cdde4728903e9d2749f853e1">tempest-TestNetworkBasicOps-666319938-project-member</nova:user>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:        <nova:project uuid="57491b24c6b2419c842483a87c8b4d42">tempest-TestNetworkBasicOps-666319938</nova:project>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      </nova:owner>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <nova:ports>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:        <nova:port uuid="accc3f40-4f4d-47c4-bad6-6e3102c0f3f0">
Oct  7 16:19:20 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      </nova:ports>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    </nova:instance>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:  <sysinfo type="smbios">
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <entry name="manufacturer">RDO</entry>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <entry name="product">OpenStack Compute</entry>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <entry name="serial">f32956f8-c207-4bd1-8e80-3da8c5b1b855</entry>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <entry name="uuid">f32956f8-c207-4bd1-8e80-3da8c5b1b855</entry>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <entry name="family">Virtual Machine</entry>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <boot dev="hd"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <smbios mode="sysinfo"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <vmcoreinfo/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:  <clock offset="utc">
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <timer name="pit" tickpolicy="delay"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <timer name="hpet" present="no"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:  <cpu mode="custom" match="exact">
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <model>Nehalem</model>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <topology sockets="1" cores="1" threads="1"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <disk type="file" device="disk">
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/disk"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <target dev="vda" bus="virtio"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <disk type="file" device="cdrom">
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <driver name="qemu" type="raw" cache="none"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/disk.config"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <target dev="sda" bus="sata"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:c4:3f:9d"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <target dev="tapaccc3f40-4f"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <serial type="pty">
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <log file="/var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/console.log" append="off"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <input type="tablet" bus="usb"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <rng model="virtio">
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <backend model="random">/dev/urandom</backend>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <controller type="usb" index="0"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    <memballoon model="virtio">
Oct  7 16:19:20 np0005474864 nova_compute[192593]:      <stats period="10"/>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:19:20 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:19:20 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:19:20 np0005474864 nova_compute[192593]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.901 2 DEBUG nova.compute.manager [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Preparing to wait for external event network-vif-plugged-accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.902 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.902 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.903 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.904 2 DEBUG nova.virt.libvirt.vif [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:19:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-135876470',display_name='tempest-TestNetworkBasicOps-server-135876470',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-135876470',id=39,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIt1OLBqU97UGpjppu8mgiVJnW5nROWcm7iCx6jDKVY1rgmzACYh6Jd5TeGR3wTJEgdIBUIn8EqQb9kR45Tg521GxEW2tkCgoMk64ZlCdFGe00w0MTZgOAIMT0aYXQNBMg==',key_name='tempest-TestNetworkBasicOps-1948664557',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-j0ja292e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:19:17Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=f32956f8-c207-4bd1-8e80-3da8c5b1b855,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "address": "fa:16:3e:c4:3f:9d", "network": {"id": "d71ace99-df2c-4a61-8bb2-6e66a8f10129", "bridge": "br-int", "label": "tempest-network-smoke--188651707", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": 
{}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaccc3f40-4f", "ovs_interfaceid": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.904 2 DEBUG nova.network.os_vif_util [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converting VIF {"id": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "address": "fa:16:3e:c4:3f:9d", "network": {"id": "d71ace99-df2c-4a61-8bb2-6e66a8f10129", "bridge": "br-int", "label": "tempest-network-smoke--188651707", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaccc3f40-4f", "ovs_interfaceid": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.905 2 DEBUG nova.network.os_vif_util [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:3f:9d,bridge_name='br-int',has_traffic_filtering=True,id=accc3f40-4f4d-47c4-bad6-6e3102c0f3f0,network=Network(d71ace99-df2c-4a61-8bb2-6e66a8f10129),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapaccc3f40-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.906 2 DEBUG os_vif [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:3f:9d,bridge_name='br-int',has_traffic_filtering=True,id=accc3f40-4f4d-47c4-bad6-6e3102c0f3f0,network=Network(d71ace99-df2c-4a61-8bb2-6e66a8f10129),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapaccc3f40-4f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.907 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.908 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.912 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.913 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaccc3f40-4f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.914 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaccc3f40-4f, col_values=(('external_ids', {'iface-id': 'accc3f40-4f4d-47c4-bad6-6e3102c0f3f0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c4:3f:9d', 'vm-uuid': 'f32956f8-c207-4bd1-8e80-3da8c5b1b855'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.917 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:19:20 np0005474864 NetworkManager[51631]: <info>  [1759868360.9184] manager: (tapaccc3f40-4f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.922 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.923 2 INFO os_vif [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:3f:9d,bridge_name='br-int',has_traffic_filtering=True,id=accc3f40-4f4d-47c4-bad6-6e3102c0f3f0,network=Network(d71ace99-df2c-4a61-8bb2-6e66a8f10129),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapaccc3f40-4f')#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.982 2 DEBUG nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.982 2 DEBUG nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.982 2 DEBUG nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] No VIF found with MAC fa:16:3e:c4:3f:9d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:19:20 np0005474864 nova_compute[192593]: 2025-10-07 20:19:20.983 2 INFO nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Using config drive#033[00m
Oct  7 16:19:21 np0005474864 nova_compute[192593]: 2025-10-07 20:19:21.283 2 INFO nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Creating config drive at /var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/disk.config#033[00m
Oct  7 16:19:21 np0005474864 nova_compute[192593]: 2025-10-07 20:19:21.289 2 DEBUG oslo_concurrency.processutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp67uusnic execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:19:21 np0005474864 nova_compute[192593]: 2025-10-07 20:19:21.417 2 DEBUG oslo_concurrency.processutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp67uusnic" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:19:21 np0005474864 kernel: tapaccc3f40-4f: entered promiscuous mode
Oct  7 16:19:21 np0005474864 NetworkManager[51631]: <info>  [1759868361.5247] manager: (tapaccc3f40-4f): new Tun device (/org/freedesktop/NetworkManager/Devices/108)
Oct  7 16:19:21 np0005474864 ovn_controller[94801]: 2025-10-07T20:19:21Z|00205|binding|INFO|Claiming lport accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 for this chassis.
Oct  7 16:19:21 np0005474864 ovn_controller[94801]: 2025-10-07T20:19:21Z|00206|binding|INFO|accc3f40-4f4d-47c4-bad6-6e3102c0f3f0: Claiming fa:16:3e:c4:3f:9d 10.100.0.3
Oct  7 16:19:21 np0005474864 nova_compute[192593]: 2025-10-07 20:19:21.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:21.531 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:3f:9d 10.100.0.3'], port_security=['fa:16:3e:c4:3f:9d 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1875269861', 'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'f32956f8-c207-4bd1-8e80-3da8c5b1b855', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d71ace99-df2c-4a61-8bb2-6e66a8f10129', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1875269861', 'neutron:project_id': '57491b24c6b2419c842483a87c8b4d42', 'neutron:revision_number': '7', 'neutron:security_group_ids': 'b761b10b-19d2-4034-8cff-5d7b8a0481c0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.202'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5e20c72c-11cd-4ca5-a6fd-1220fe625e7a, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=accc3f40-4f4d-47c4-bad6-6e3102c0f3f0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:19:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:21.534 103685 INFO neutron.agent.ovn.metadata.agent [-] Port accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 in datapath d71ace99-df2c-4a61-8bb2-6e66a8f10129 bound to our chassis#033[00m
Oct  7 16:19:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:21.537 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d71ace99-df2c-4a61-8bb2-6e66a8f10129#033[00m
Oct  7 16:19:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:21.555 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[06d89e3e-16cf-4ec6-ad86-1490bc2ced2b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:21.556 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd71ace99-d1 in ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:19:21 np0005474864 ovn_controller[94801]: 2025-10-07T20:19:21Z|00207|binding|INFO|Setting lport accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 ovn-installed in OVS
Oct  7 16:19:21 np0005474864 ovn_controller[94801]: 2025-10-07T20:19:21Z|00208|binding|INFO|Setting lport accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 up in Southbound
Oct  7 16:19:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:21.560 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd71ace99-d0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:19:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:21.560 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[f64ead63-120d-4950-b0a8-6b3e840de103]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:21.561 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[f99f0742-42a4-4e91-a054-3af2ae760567]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:21 np0005474864 nova_compute[192593]: 2025-10-07 20:19:21.562 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:21 np0005474864 nova_compute[192593]: 2025-10-07 20:19:21.572 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:21.581 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[c1f741f7-00ff-4b76-bc9c-2da492827ed1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:21 np0005474864 systemd-machined[152586]: New machine qemu-13-instance-00000027.
Oct  7 16:19:21 np0005474864 systemd[1]: Started Virtual Machine qemu-13-instance-00000027.
Oct  7 16:19:21 np0005474864 systemd-udevd[227009]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:19:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:21.614 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[0433a63c-d112-4c8d-a8bf-01e90f75883d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:21 np0005474864 NetworkManager[51631]: <info>  [1759868361.6333] device (tapaccc3f40-4f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:19:21 np0005474864 NetworkManager[51631]: <info>  [1759868361.6345] device (tapaccc3f40-4f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:19:21 np0005474864 podman[226990]: 2025-10-07 20:19:21.657315292 +0000 UTC m=+0.134095484 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:19:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:21.660 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[ec3b117e-8716-430e-ad86-59fcd28b7d86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:21 np0005474864 NetworkManager[51631]: <info>  [1759868361.6691] manager: (tapd71ace99-d0): new Veth device (/org/freedesktop/NetworkManager/Devices/109)
Oct  7 16:19:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:21.668 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[5d6bacb3-9349-4980-888f-71a207854f7e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:21.716 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[408c6dd5-b6dd-4017-b4f2-a08c9d6199c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:21.720 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[abe12a30-ae9f-45cf-8aa7-c732c47ba25c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:21 np0005474864 NetworkManager[51631]: <info>  [1759868361.7526] device (tapd71ace99-d0): carrier: link connected
Oct  7 16:19:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:21.765 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[60d7979c-d898-47f1-88e0-994cb773cc0b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:21.793 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[e9fac677-0eeb-4d29-a3e6-c14590330875]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd71ace99-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:b1:41'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399764, 'reachable_time': 34215, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227051, 'error': None, 'target': 'ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:21.817 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[fd26df0d-e4fb-4fcc-84b5-a6f57f160c98]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe22:b141'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 399764, 'tstamp': 399764}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227052, 'error': None, 'target': 'ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:21.844 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[9a3bad99-d19b-4194-a5c2-adaacc401e74]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd71ace99-d1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:22:b1:41'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399764, 'reachable_time': 34215, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227053, 'error': None, 'target': 'ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:21.890 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[948e9035-02c9-4509-8686-03d165a852ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:22.000 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[f85ee43b-654a-4e6c-a917-661ccf3b49ee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:22.002 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd71ace99-d0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:22.002 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:22.003 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd71ace99-d0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:19:22 np0005474864 nova_compute[192593]: 2025-10-07 20:19:22.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:22 np0005474864 kernel: tapd71ace99-d0: entered promiscuous mode
Oct  7 16:19:22 np0005474864 NetworkManager[51631]: <info>  [1759868362.0079] manager: (tapd71ace99-d0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/110)
Oct  7 16:19:22 np0005474864 nova_compute[192593]: 2025-10-07 20:19:22.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:22.014 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd71ace99-d0, col_values=(('external_ids', {'iface-id': '890896ef-76b7-49e7-9b74-28f6b583cf27'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:19:22 np0005474864 nova_compute[192593]: 2025-10-07 20:19:22.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:22 np0005474864 ovn_controller[94801]: 2025-10-07T20:19:22Z|00209|binding|INFO|Releasing lport 890896ef-76b7-49e7-9b74-28f6b583cf27 from this chassis (sb_readonly=0)
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:22.018 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d71ace99-df2c-4a61-8bb2-6e66a8f10129.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d71ace99-df2c-4a61-8bb2-6e66a8f10129.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:22.020 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[0c32ba88-74c7-4c98-85aa-b07a3d9b2da1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:22.021 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-d71ace99-df2c-4a61-8bb2-6e66a8f10129
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/d71ace99-df2c-4a61-8bb2-6e66a8f10129.pid.haproxy
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID d71ace99-df2c-4a61-8bb2-6e66a8f10129
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:19:22 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:22.022 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129', 'env', 'PROCESS_TAG=haproxy-d71ace99-df2c-4a61-8bb2-6e66a8f10129', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d71ace99-df2c-4a61-8bb2-6e66a8f10129.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:19:22 np0005474864 nova_compute[192593]: 2025-10-07 20:19:22.032 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:22 np0005474864 podman[227090]: 2025-10-07 20:19:22.50530706 +0000 UTC m=+0.074661462 container create 9b1f1ed7b4ba4b6d6e513f0436a128cd030b6c616c45c33c2b97a655e96a29f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  7 16:19:22 np0005474864 systemd[1]: Started libpod-conmon-9b1f1ed7b4ba4b6d6e513f0436a128cd030b6c616c45c33c2b97a655e96a29f6.scope.
Oct  7 16:19:22 np0005474864 podman[227090]: 2025-10-07 20:19:22.463781633 +0000 UTC m=+0.033136135 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:19:22 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:19:22 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b0b3a2adade3663740d9960ffb29c2c4c3623d673cd2230b1c7629019ad23cfb/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:19:22 np0005474864 nova_compute[192593]: 2025-10-07 20:19:22.587 2 DEBUG nova.network.neutron [req-e7cf3f14-22ea-4a3f-94bc-1b26aa45d718 req-50704b20-4777-4515-a288-3644dd104dce 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Updated VIF entry in instance network info cache for port accc3f40-4f4d-47c4-bad6-6e3102c0f3f0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:19:22 np0005474864 nova_compute[192593]: 2025-10-07 20:19:22.588 2 DEBUG nova.network.neutron [req-e7cf3f14-22ea-4a3f-94bc-1b26aa45d718 req-50704b20-4777-4515-a288-3644dd104dce 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Updating instance_info_cache with network_info: [{"id": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "address": "fa:16:3e:c4:3f:9d", "network": {"id": "d71ace99-df2c-4a61-8bb2-6e66a8f10129", "bridge": "br-int", "label": "tempest-network-smoke--188651707", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaccc3f40-4f", "ovs_interfaceid": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:19:22 np0005474864 podman[227090]: 2025-10-07 20:19:22.604461766 +0000 UTC m=+0.173816248 container init 9b1f1ed7b4ba4b6d6e513f0436a128cd030b6c616c45c33c2b97a655e96a29f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:19:22 np0005474864 podman[227090]: 2025-10-07 20:19:22.610619903 +0000 UTC m=+0.179974335 container start 9b1f1ed7b4ba4b6d6e513f0436a128cd030b6c616c45c33c2b97a655e96a29f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  7 16:19:22 np0005474864 nova_compute[192593]: 2025-10-07 20:19:22.616 2 DEBUG oslo_concurrency.lockutils [req-e7cf3f14-22ea-4a3f-94bc-1b26aa45d718 req-50704b20-4777-4515-a288-3644dd104dce 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-f32956f8-c207-4bd1-8e80-3da8c5b1b855" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:19:22 np0005474864 neutron-haproxy-ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129[227106]: [NOTICE]   (227110) : New worker (227112) forked
Oct  7 16:19:22 np0005474864 neutron-haproxy-ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129[227106]: [NOTICE]   (227110) : Loading success.
Oct  7 16:19:22 np0005474864 nova_compute[192593]: 2025-10-07 20:19:22.802 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868362.802133, f32956f8-c207-4bd1-8e80-3da8c5b1b855 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:19:22 np0005474864 nova_compute[192593]: 2025-10-07 20:19:22.803 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] VM Started (Lifecycle Event)#033[00m
Oct  7 16:19:22 np0005474864 nova_compute[192593]: 2025-10-07 20:19:22.827 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:19:22 np0005474864 nova_compute[192593]: 2025-10-07 20:19:22.831 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868362.8024359, f32956f8-c207-4bd1-8e80-3da8c5b1b855 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:19:22 np0005474864 nova_compute[192593]: 2025-10-07 20:19:22.831 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:19:22 np0005474864 nova_compute[192593]: 2025-10-07 20:19:22.854 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:19:22 np0005474864 nova_compute[192593]: 2025-10-07 20:19:22.861 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:19:22 np0005474864 nova_compute[192593]: 2025-10-07 20:19:22.882 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.707 2 DEBUG nova.compute.manager [req-37c61a3d-c6d2-445b-a190-131a84f8f85c req-e088d98d-8d6f-4031-88cf-c397523355b4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Received event network-vif-plugged-accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.708 2 DEBUG oslo_concurrency.lockutils [req-37c61a3d-c6d2-445b-a190-131a84f8f85c req-e088d98d-8d6f-4031-88cf-c397523355b4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.709 2 DEBUG oslo_concurrency.lockutils [req-37c61a3d-c6d2-445b-a190-131a84f8f85c req-e088d98d-8d6f-4031-88cf-c397523355b4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.709 2 DEBUG oslo_concurrency.lockutils [req-37c61a3d-c6d2-445b-a190-131a84f8f85c req-e088d98d-8d6f-4031-88cf-c397523355b4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.710 2 DEBUG nova.compute.manager [req-37c61a3d-c6d2-445b-a190-131a84f8f85c req-e088d98d-8d6f-4031-88cf-c397523355b4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Processing event network-vif-plugged-accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.710 2 DEBUG nova.compute.manager [req-37c61a3d-c6d2-445b-a190-131a84f8f85c req-e088d98d-8d6f-4031-88cf-c397523355b4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Received event network-vif-plugged-accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.711 2 DEBUG oslo_concurrency.lockutils [req-37c61a3d-c6d2-445b-a190-131a84f8f85c req-e088d98d-8d6f-4031-88cf-c397523355b4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.711 2 DEBUG oslo_concurrency.lockutils [req-37c61a3d-c6d2-445b-a190-131a84f8f85c req-e088d98d-8d6f-4031-88cf-c397523355b4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.712 2 DEBUG oslo_concurrency.lockutils [req-37c61a3d-c6d2-445b-a190-131a84f8f85c req-e088d98d-8d6f-4031-88cf-c397523355b4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.712 2 DEBUG nova.compute.manager [req-37c61a3d-c6d2-445b-a190-131a84f8f85c req-e088d98d-8d6f-4031-88cf-c397523355b4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] No waiting events found dispatching network-vif-plugged-accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.713 2 WARNING nova.compute.manager [req-37c61a3d-c6d2-445b-a190-131a84f8f85c req-e088d98d-8d6f-4031-88cf-c397523355b4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Received unexpected event network-vif-plugged-accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 for instance with vm_state building and task_state spawning.#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.714 2 DEBUG nova.compute.manager [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.719 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868363.7183044, f32956f8-c207-4bd1-8e80-3da8c5b1b855 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.720 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.723 2 DEBUG nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.727 2 INFO nova.virt.libvirt.driver [-] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Instance spawned successfully.#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.728 2 DEBUG nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.764 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.769 2 DEBUG nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.769 2 DEBUG nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.770 2 DEBUG nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.770 2 DEBUG nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.771 2 DEBUG nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.771 2 DEBUG nova.virt.libvirt.driver [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.777 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.821 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.851 2 INFO nova.compute.manager [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Took 5.88 seconds to spawn the instance on the hypervisor.#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.851 2 DEBUG nova.compute.manager [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.915 2 INFO nova.compute.manager [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Took 6.39 seconds to build instance.#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.924 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759868348.9241045, d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.925 2 INFO nova.compute.manager [-] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] VM Stopped (Lifecycle Event)#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.951 2 DEBUG nova.compute.manager [None req-ca97a08c-f768-49b9-b92e-66ac2e6046f9 - - - - - -] [instance: d24a1a7c-76dd-49a0-a4c4-1cbcd5e0eabb] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.952 2 DEBUG oslo_concurrency.lockutils [None req-64779b59-0327-4d0e-8ffc-9221d91891a0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.478s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:23 np0005474864 nova_compute[192593]: 2025-10-07 20:19:23.987 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:24 np0005474864 nova_compute[192593]: 2025-10-07 20:19:24.110 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:19:24 np0005474864 nova_compute[192593]: 2025-10-07 20:19:24.207 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:24 np0005474864 nova_compute[192593]: 2025-10-07 20:19:24.208 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:24 np0005474864 nova_compute[192593]: 2025-10-07 20:19:24.208 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:24 np0005474864 nova_compute[192593]: 2025-10-07 20:19:24.209 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:19:24 np0005474864 nova_compute[192593]: 2025-10-07 20:19:24.386 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:19:24 np0005474864 nova_compute[192593]: 2025-10-07 20:19:24.488 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/disk --force-share --output=json" returned: 0 in 0.101s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:19:24 np0005474864 nova_compute[192593]: 2025-10-07 20:19:24.490 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:19:24 np0005474864 nova_compute[192593]: 2025-10-07 20:19:24.583 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855/disk --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:19:24 np0005474864 nova_compute[192593]: 2025-10-07 20:19:24.808 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:19:24 np0005474864 nova_compute[192593]: 2025-10-07 20:19:24.810 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5656MB free_disk=73.46248245239258GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:19:24 np0005474864 nova_compute[192593]: 2025-10-07 20:19:24.810 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:24 np0005474864 nova_compute[192593]: 2025-10-07 20:19:24.811 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:25 np0005474864 nova_compute[192593]: 2025-10-07 20:19:25.094 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Instance f32956f8-c207-4bd1-8e80-3da8c5b1b855 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  7 16:19:25 np0005474864 nova_compute[192593]: 2025-10-07 20:19:25.094 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:19:25 np0005474864 nova_compute[192593]: 2025-10-07 20:19:25.095 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:19:25 np0005474864 nova_compute[192593]: 2025-10-07 20:19:25.146 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:19:25 np0005474864 nova_compute[192593]: 2025-10-07 20:19:25.196 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:19:25 np0005474864 nova_compute[192593]: 2025-10-07 20:19:25.227 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:19:25 np0005474864 nova_compute[192593]: 2025-10-07 20:19:25.227 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.417s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:25 np0005474864 podman[227129]: 2025-10-07 20:19:25.393359564 +0000 UTC m=+0.081416505 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  7 16:19:25 np0005474864 nova_compute[192593]: 2025-10-07 20:19:25.920 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:26 np0005474864 nova_compute[192593]: 2025-10-07 20:19:26.212 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:19:26 np0005474864 nova_compute[192593]: 2025-10-07 20:19:26.212 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:19:26 np0005474864 nova_compute[192593]: 2025-10-07 20:19:26.213 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:19:26 np0005474864 ovn_controller[94801]: 2025-10-07T20:19:26Z|00210|binding|INFO|Releasing lport 890896ef-76b7-49e7-9b74-28f6b583cf27 from this chassis (sb_readonly=0)
Oct  7 16:19:26 np0005474864 nova_compute[192593]: 2025-10-07 20:19:26.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:27 np0005474864 nova_compute[192593]: 2025-10-07 20:19:27.088 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:19:27 np0005474864 nova_compute[192593]: 2025-10-07 20:19:27.735 2 DEBUG oslo_concurrency.lockutils [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:27 np0005474864 nova_compute[192593]: 2025-10-07 20:19:27.736 2 DEBUG oslo_concurrency.lockutils [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:27 np0005474864 nova_compute[192593]: 2025-10-07 20:19:27.737 2 DEBUG oslo_concurrency.lockutils [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:27 np0005474864 nova_compute[192593]: 2025-10-07 20:19:27.738 2 DEBUG oslo_concurrency.lockutils [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:27 np0005474864 nova_compute[192593]: 2025-10-07 20:19:27.738 2 DEBUG oslo_concurrency.lockutils [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:27 np0005474864 nova_compute[192593]: 2025-10-07 20:19:27.741 2 INFO nova.compute.manager [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Terminating instance#033[00m
Oct  7 16:19:27 np0005474864 nova_compute[192593]: 2025-10-07 20:19:27.743 2 DEBUG nova.compute.manager [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  7 16:19:27 np0005474864 kernel: tapaccc3f40-4f (unregistering): left promiscuous mode
Oct  7 16:19:27 np0005474864 NetworkManager[51631]: <info>  [1759868367.7726] device (tapaccc3f40-4f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:19:27 np0005474864 ovn_controller[94801]: 2025-10-07T20:19:27Z|00211|binding|INFO|Releasing lport accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 from this chassis (sb_readonly=0)
Oct  7 16:19:27 np0005474864 nova_compute[192593]: 2025-10-07 20:19:27.780 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:27 np0005474864 ovn_controller[94801]: 2025-10-07T20:19:27Z|00212|binding|INFO|Setting lport accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 down in Southbound
Oct  7 16:19:27 np0005474864 ovn_controller[94801]: 2025-10-07T20:19:27Z|00213|binding|INFO|Removing iface tapaccc3f40-4f ovn-installed in OVS
Oct  7 16:19:27 np0005474864 nova_compute[192593]: 2025-10-07 20:19:27.785 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:27 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:27.798 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c4:3f:9d 10.100.0.3'], port_security=['fa:16:3e:c4:3f:9d 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-TestNetworkBasicOps-1875269861', 'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': 'f32956f8-c207-4bd1-8e80-3da8c5b1b855', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d71ace99-df2c-4a61-8bb2-6e66a8f10129', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-TestNetworkBasicOps-1875269861', 'neutron:project_id': '57491b24c6b2419c842483a87c8b4d42', 'neutron:revision_number': '9', 'neutron:security_group_ids': 'b761b10b-19d2-4034-8cff-5d7b8a0481c0', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.202', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=5e20c72c-11cd-4ca5-a6fd-1220fe625e7a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=accc3f40-4f4d-47c4-bad6-6e3102c0f3f0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:19:27 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:27.799 103685 INFO neutron.agent.ovn.metadata.agent [-] Port accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 in datapath d71ace99-df2c-4a61-8bb2-6e66a8f10129 unbound from our chassis#033[00m
Oct  7 16:19:27 np0005474864 nova_compute[192593]: 2025-10-07 20:19:27.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:27 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:27.801 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d71ace99-df2c-4a61-8bb2-6e66a8f10129, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:19:27 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:27.802 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[55313dd8-1e38-4857-91ab-629f9a6d5273]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:27 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:27.803 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129 namespace which is not needed anymore#033[00m
Oct  7 16:19:27 np0005474864 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000027.scope: Deactivated successfully.
Oct  7 16:19:27 np0005474864 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d00000027.scope: Consumed 5.211s CPU time.
Oct  7 16:19:27 np0005474864 systemd-machined[152586]: Machine qemu-13-instance-00000027 terminated.
Oct  7 16:19:27 np0005474864 neutron-haproxy-ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129[227106]: [NOTICE]   (227110) : haproxy version is 2.8.14-c23fe91
Oct  7 16:19:27 np0005474864 neutron-haproxy-ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129[227106]: [NOTICE]   (227110) : path to executable is /usr/sbin/haproxy
Oct  7 16:19:27 np0005474864 neutron-haproxy-ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129[227106]: [WARNING]  (227110) : Exiting Master process...
Oct  7 16:19:27 np0005474864 neutron-haproxy-ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129[227106]: [ALERT]    (227110) : Current worker (227112) exited with code 143 (Terminated)
Oct  7 16:19:27 np0005474864 neutron-haproxy-ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129[227106]: [WARNING]  (227110) : All workers exited. Exiting... (0)
Oct  7 16:19:27 np0005474864 systemd[1]: libpod-9b1f1ed7b4ba4b6d6e513f0436a128cd030b6c616c45c33c2b97a655e96a29f6.scope: Deactivated successfully.
Oct  7 16:19:27 np0005474864 conmon[227106]: conmon 9b1f1ed7b4ba4b6d6e51 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-9b1f1ed7b4ba4b6d6e513f0436a128cd030b6c616c45c33c2b97a655e96a29f6.scope/container/memory.events
Oct  7 16:19:28 np0005474864 podman[227175]: 2025-10-07 20:19:28.001812961 +0000 UTC m=+0.082461375 container died 9b1f1ed7b4ba4b6d6e513f0436a128cd030b6c616c45c33c2b97a655e96a29f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.025 2 INFO nova.virt.libvirt.driver [-] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Instance destroyed successfully.#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.026 2 DEBUG nova.objects.instance [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lazy-loading 'resources' on Instance uuid f32956f8-c207-4bd1-8e80-3da8c5b1b855 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:19:28 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9b1f1ed7b4ba4b6d6e513f0436a128cd030b6c616c45c33c2b97a655e96a29f6-userdata-shm.mount: Deactivated successfully.
Oct  7 16:19:28 np0005474864 systemd[1]: var-lib-containers-storage-overlay-b0b3a2adade3663740d9960ffb29c2c4c3623d673cd2230b1c7629019ad23cfb-merged.mount: Deactivated successfully.
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.042 2 DEBUG nova.virt.libvirt.vif [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:19:16Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-135876470',display_name='tempest-TestNetworkBasicOps-server-135876470',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-135876470',id=39,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIt1OLBqU97UGpjppu8mgiVJnW5nROWcm7iCx6jDKVY1rgmzACYh6Jd5TeGR3wTJEgdIBUIn8EqQb9kR45Tg521GxEW2tkCgoMk64ZlCdFGe00w0MTZgOAIMT0aYXQNBMg==',key_name='tempest-TestNetworkBasicOps-1948664557',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:19:23Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-j0ja292e',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:19:23Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=f32956f8-c207-4bd1-8e80-3da8c5b1b855,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "address": "fa:16:3e:c4:3f:9d", "network": {"id": "d71ace99-df2c-4a61-8bb2-6e66a8f10129", "bridge": "br-int", "label": "tempest-network-smoke--188651707", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaccc3f40-4f", "ovs_interfaceid": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.042 2 DEBUG nova.network.os_vif_util [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converting VIF {"id": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "address": "fa:16:3e:c4:3f:9d", "network": {"id": "d71ace99-df2c-4a61-8bb2-6e66a8f10129", "bridge": "br-int", "label": "tempest-network-smoke--188651707", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.202", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaccc3f40-4f", "ovs_interfaceid": "accc3f40-4f4d-47c4-bad6-6e3102c0f3f0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.043 2 DEBUG nova.network.os_vif_util [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c4:3f:9d,bridge_name='br-int',has_traffic_filtering=True,id=accc3f40-4f4d-47c4-bad6-6e3102c0f3f0,network=Network(d71ace99-df2c-4a61-8bb2-6e66a8f10129),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapaccc3f40-4f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.044 2 DEBUG os_vif [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:3f:9d,bridge_name='br-int',has_traffic_filtering=True,id=accc3f40-4f4d-47c4-bad6-6e3102c0f3f0,network=Network(d71ace99-df2c-4a61-8bb2-6e66a8f10129),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapaccc3f40-4f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.046 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaccc3f40-4f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:19:28 np0005474864 podman[227175]: 2025-10-07 20:19:28.047496037 +0000 UTC m=+0.128144461 container cleanup 9b1f1ed7b4ba4b6d6e513f0436a128cd030b6c616c45c33c2b97a655e96a29f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.052 2 INFO os_vif [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c4:3f:9d,bridge_name='br-int',has_traffic_filtering=True,id=accc3f40-4f4d-47c4-bad6-6e3102c0f3f0,network=Network(d71ace99-df2c-4a61-8bb2-6e66a8f10129),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapaccc3f40-4f')#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.053 2 INFO nova.virt.libvirt.driver [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Deleting instance files /var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855_del#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.054 2 INFO nova.virt.libvirt.driver [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Deletion of /var/lib/nova/instances/f32956f8-c207-4bd1-8e80-3da8c5b1b855_del complete#033[00m
Oct  7 16:19:28 np0005474864 systemd[1]: libpod-conmon-9b1f1ed7b4ba4b6d6e513f0436a128cd030b6c616c45c33c2b97a655e96a29f6.scope: Deactivated successfully.
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.086 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.113 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.114 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.114 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.117 2 INFO nova.compute.manager [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.118 2 DEBUG oslo.service.loopingcall [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.119 2 DEBUG nova.compute.manager [-] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.119 2 DEBUG nova.network.neutron [-] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.134 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Skipping network cache update for instance because it is being deleted. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9875#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.135 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:19:28 np0005474864 podman[227219]: 2025-10-07 20:19:28.135886252 +0000 UTC m=+0.052805612 container remove 9b1f1ed7b4ba4b6d6e513f0436a128cd030b6c616c45c33c2b97a655e96a29f6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:19:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:28.145 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[2cdc143c-4587-4ba6-969c-a8ce0a76b11c]: (4, ('Tue Oct  7 08:19:27 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129 (9b1f1ed7b4ba4b6d6e513f0436a128cd030b6c616c45c33c2b97a655e96a29f6)\n9b1f1ed7b4ba4b6d6e513f0436a128cd030b6c616c45c33c2b97a655e96a29f6\nTue Oct  7 08:19:28 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129 (9b1f1ed7b4ba4b6d6e513f0436a128cd030b6c616c45c33c2b97a655e96a29f6)\n9b1f1ed7b4ba4b6d6e513f0436a128cd030b6c616c45c33c2b97a655e96a29f6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:28.147 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[6ef61957-7983-4d14-a53b-481feec95cf9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:28.149 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd71ace99-d0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:19:28 np0005474864 kernel: tapd71ace99-d0: left promiscuous mode
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:28.159 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[5a99b169-0ddf-4f4c-8431-5f0c8237147b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.167 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:28.192 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[9d7a1010-54a2-4378-9d87-46ced9391d8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:28.194 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[aa2ae3ea-beb7-4c23-8684-06b2f0dd6d87]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:28.219 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c3613e4b-e537-454b-90fc-91ffc343955d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 399754, 'reachable_time': 25621, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227234, 'error': None, 'target': 'ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:28 np0005474864 systemd[1]: run-netns-ovnmeta\x2dd71ace99\x2ddf2c\x2d4a61\x2d8bb2\x2d6e66a8f10129.mount: Deactivated successfully.
Oct  7 16:19:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:28.225 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d71ace99-df2c-4a61-8bb2-6e66a8f10129 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:19:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:28.225 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[2e8d99fa-d1aa-45f6-bc9b-ee934b2d1f05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.289 2 DEBUG nova.compute.manager [req-06412d75-c985-4a20-a2dd-47eabe324110 req-c9163a41-485f-40d7-b8c1-546bed46bc96 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Received event network-vif-unplugged-accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.289 2 DEBUG oslo_concurrency.lockutils [req-06412d75-c985-4a20-a2dd-47eabe324110 req-c9163a41-485f-40d7-b8c1-546bed46bc96 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.290 2 DEBUG oslo_concurrency.lockutils [req-06412d75-c985-4a20-a2dd-47eabe324110 req-c9163a41-485f-40d7-b8c1-546bed46bc96 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.290 2 DEBUG oslo_concurrency.lockutils [req-06412d75-c985-4a20-a2dd-47eabe324110 req-c9163a41-485f-40d7-b8c1-546bed46bc96 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.291 2 DEBUG nova.compute.manager [req-06412d75-c985-4a20-a2dd-47eabe324110 req-c9163a41-485f-40d7-b8c1-546bed46bc96 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] No waiting events found dispatching network-vif-unplugged-accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.291 2 DEBUG nova.compute.manager [req-06412d75-c985-4a20-a2dd-47eabe324110 req-c9163a41-485f-40d7-b8c1-546bed46bc96 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Received event network-vif-unplugged-accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.858 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:28 np0005474864 nova_compute[192593]: 2025-10-07 20:19:28.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:29 np0005474864 nova_compute[192593]: 2025-10-07 20:19:29.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:19:29 np0005474864 nova_compute[192593]: 2025-10-07 20:19:29.218 2 DEBUG nova.network.neutron [-] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:19:29 np0005474864 nova_compute[192593]: 2025-10-07 20:19:29.264 2 INFO nova.compute.manager [-] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Took 1.15 seconds to deallocate network for instance.#033[00m
Oct  7 16:19:29 np0005474864 nova_compute[192593]: 2025-10-07 20:19:29.326 2 DEBUG oslo_concurrency.lockutils [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:29 np0005474864 nova_compute[192593]: 2025-10-07 20:19:29.327 2 DEBUG oslo_concurrency.lockutils [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:29 np0005474864 nova_compute[192593]: 2025-10-07 20:19:29.386 2 DEBUG nova.compute.provider_tree [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:19:29 np0005474864 nova_compute[192593]: 2025-10-07 20:19:29.405 2 DEBUG nova.scheduler.client.report [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:19:29 np0005474864 nova_compute[192593]: 2025-10-07 20:19:29.427 2 DEBUG oslo_concurrency.lockutils [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:29 np0005474864 nova_compute[192593]: 2025-10-07 20:19:29.458 2 INFO nova.scheduler.client.report [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Deleted allocations for instance f32956f8-c207-4bd1-8e80-3da8c5b1b855#033[00m
Oct  7 16:19:29 np0005474864 nova_compute[192593]: 2025-10-07 20:19:29.554 2 DEBUG oslo_concurrency.lockutils [None req-6805d064-eb0f-4510-86f0-bc0c2be966c0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.818s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:30 np0005474864 nova_compute[192593]: 2025-10-07 20:19:30.384 2 DEBUG nova.compute.manager [req-a4bc534e-b0cb-41ff-afb7-f33089304ffe req-b57b399c-3b42-4019-a93a-a2d9066acf40 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Received event network-vif-plugged-accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:19:30 np0005474864 nova_compute[192593]: 2025-10-07 20:19:30.384 2 DEBUG oslo_concurrency.lockutils [req-a4bc534e-b0cb-41ff-afb7-f33089304ffe req-b57b399c-3b42-4019-a93a-a2d9066acf40 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:19:30 np0005474864 nova_compute[192593]: 2025-10-07 20:19:30.385 2 DEBUG oslo_concurrency.lockutils [req-a4bc534e-b0cb-41ff-afb7-f33089304ffe req-b57b399c-3b42-4019-a93a-a2d9066acf40 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:19:30 np0005474864 nova_compute[192593]: 2025-10-07 20:19:30.385 2 DEBUG oslo_concurrency.lockutils [req-a4bc534e-b0cb-41ff-afb7-f33089304ffe req-b57b399c-3b42-4019-a93a-a2d9066acf40 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "f32956f8-c207-4bd1-8e80-3da8c5b1b855-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:19:30 np0005474864 nova_compute[192593]: 2025-10-07 20:19:30.385 2 DEBUG nova.compute.manager [req-a4bc534e-b0cb-41ff-afb7-f33089304ffe req-b57b399c-3b42-4019-a93a-a2d9066acf40 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] No waiting events found dispatching network-vif-plugged-accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:19:30 np0005474864 nova_compute[192593]: 2025-10-07 20:19:30.386 2 WARNING nova.compute.manager [req-a4bc534e-b0cb-41ff-afb7-f33089304ffe req-b57b399c-3b42-4019-a93a-a2d9066acf40 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Received unexpected event network-vif-plugged-accc3f40-4f4d-47c4-bad6-6e3102c0f3f0 for instance with vm_state deleted and task_state None.#033[00m
Oct  7 16:19:31 np0005474864 nova_compute[192593]: 2025-10-07 20:19:31.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:19:31 np0005474864 nova_compute[192593]: 2025-10-07 20:19:31.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:19:31 np0005474864 nova_compute[192593]: 2025-10-07 20:19:31.094 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:19:33 np0005474864 nova_compute[192593]: 2025-10-07 20:19:33.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:33 np0005474864 nova_compute[192593]: 2025-10-07 20:19:33.990 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:38 np0005474864 nova_compute[192593]: 2025-10-07 20:19:38.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:38 np0005474864 podman[227235]: 2025-10-07 20:19:38.386123056 +0000 UTC m=+0.070771669 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  7 16:19:38 np0005474864 podman[227236]: 2025-10-07 20:19:38.407709607 +0000 UTC m=+0.093830783 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, release=1755695350, architecture=x86_64, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  7 16:19:39 np0005474864 nova_compute[192593]: 2025-10-07 20:19:39.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:43 np0005474864 nova_compute[192593]: 2025-10-07 20:19:43.024 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759868368.0223465, f32956f8-c207-4bd1-8e80-3da8c5b1b855 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:19:43 np0005474864 nova_compute[192593]: 2025-10-07 20:19:43.025 2 INFO nova.compute.manager [-] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] VM Stopped (Lifecycle Event)#033[00m
Oct  7 16:19:43 np0005474864 nova_compute[192593]: 2025-10-07 20:19:43.052 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:43 np0005474864 nova_compute[192593]: 2025-10-07 20:19:43.066 2 DEBUG nova.compute.manager [None req-d1fc5bf3-27ac-43a6-878d-c1b6d141d1c9 - - - - - -] [instance: f32956f8-c207-4bd1-8e80-3da8c5b1b855] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:19:43 np0005474864 podman[227282]: 2025-10-07 20:19:43.4248534 +0000 UTC m=+0.096429428 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  7 16:19:43 np0005474864 podman[227280]: 2025-10-07 20:19:43.438439671 +0000 UTC m=+0.128344857 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:19:43 np0005474864 podman[227281]: 2025-10-07 20:19:43.456013257 +0000 UTC m=+0.133315490 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_controller)
Oct  7 16:19:44 np0005474864 nova_compute[192593]: 2025-10-07 20:19:44.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:48 np0005474864 nova_compute[192593]: 2025-10-07 20:19:48.055 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:49 np0005474864 nova_compute[192593]: 2025-10-07 20:19:49.018 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:50 np0005474864 podman[227348]: 2025-10-07 20:19:50.401625507 +0000 UTC m=+0.091091764 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  7 16:19:52 np0005474864 podman[227367]: 2025-10-07 20:19:52.406093464 +0000 UTC m=+0.088740736 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  7 16:19:53 np0005474864 nova_compute[192593]: 2025-10-07 20:19:53.058 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:54 np0005474864 nova_compute[192593]: 2025-10-07 20:19:54.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:56 np0005474864 podman[227391]: 2025-10-07 20:19:56.356324316 +0000 UTC m=+0.052006718 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:19:58 np0005474864 nova_compute[192593]: 2025-10-07 20:19:58.060 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:58 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:58.789 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:76:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'ba:bb:91:e8:2b:5d'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:19:58 np0005474864 nova_compute[192593]: 2025-10-07 20:19:58.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:58 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:58.790 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  7 16:19:59 np0005474864 nova_compute[192593]: 2025-10-07 20:19:59.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:19:59 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:19:59.792 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:03 np0005474864 nova_compute[192593]: 2025-10-07 20:20:03.062 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:04 np0005474864 nova_compute[192593]: 2025-10-07 20:20:04.025 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:04 np0005474864 nova_compute[192593]: 2025-10-07 20:20:04.141 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "30de244b-c8b3-47e1-99a2-f00752af916f" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:04 np0005474864 nova_compute[192593]: 2025-10-07 20:20:04.142 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "30de244b-c8b3-47e1-99a2-f00752af916f" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:04 np0005474864 nova_compute[192593]: 2025-10-07 20:20:04.238 2 DEBUG nova.compute.manager [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  7 16:20:04 np0005474864 nova_compute[192593]: 2025-10-07 20:20:04.488 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:04 np0005474864 nova_compute[192593]: 2025-10-07 20:20:04.488 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:04 np0005474864 nova_compute[192593]: 2025-10-07 20:20:04.495 2 DEBUG nova.virt.hardware [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  7 16:20:04 np0005474864 nova_compute[192593]: 2025-10-07 20:20:04.496 2 INFO nova.compute.claims [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  7 16:20:04 np0005474864 nova_compute[192593]: 2025-10-07 20:20:04.749 2 DEBUG nova.compute.provider_tree [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:20:04 np0005474864 nova_compute[192593]: 2025-10-07 20:20:04.813 2 DEBUG nova.scheduler.client.report [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.062 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.574s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.063 2 DEBUG nova.compute.manager [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.221 2 DEBUG nova.compute.manager [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.222 2 DEBUG nova.network.neutron [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.306 2 INFO nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.371 2 DEBUG nova.compute.manager [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.444 2 DEBUG nova.policy [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.618 2 DEBUG nova.compute.manager [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.620 2 DEBUG nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.621 2 INFO nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Creating image(s)#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.622 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "/var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.622 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "/var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.624 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "/var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.651 2 DEBUG oslo_concurrency.processutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.726 2 DEBUG oslo_concurrency.processutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.728 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.729 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.753 2 DEBUG oslo_concurrency.processutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.814 2 DEBUG oslo_concurrency.processutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.816 2 DEBUG oslo_concurrency.processutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.858 2 DEBUG oslo_concurrency.processutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.860 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.131s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.861 2 DEBUG oslo_concurrency.processutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.937 2 DEBUG oslo_concurrency.processutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.939 2 DEBUG nova.virt.disk.api [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Checking if we can resize image /var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  7 16:20:05 np0005474864 nova_compute[192593]: 2025-10-07 20:20:05.941 2 DEBUG oslo_concurrency.processutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:20:06 np0005474864 nova_compute[192593]: 2025-10-07 20:20:06.009 2 DEBUG oslo_concurrency.processutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:20:06 np0005474864 nova_compute[192593]: 2025-10-07 20:20:06.010 2 DEBUG nova.virt.disk.api [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Cannot resize image /var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  7 16:20:06 np0005474864 nova_compute[192593]: 2025-10-07 20:20:06.011 2 DEBUG nova.objects.instance [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lazy-loading 'migration_context' on Instance uuid 30de244b-c8b3-47e1-99a2-f00752af916f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:20:06 np0005474864 nova_compute[192593]: 2025-10-07 20:20:06.179 2 DEBUG nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  7 16:20:06 np0005474864 nova_compute[192593]: 2025-10-07 20:20:06.180 2 DEBUG nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Ensure instance console log exists: /var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  7 16:20:06 np0005474864 nova_compute[192593]: 2025-10-07 20:20:06.181 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:06 np0005474864 nova_compute[192593]: 2025-10-07 20:20:06.181 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:06 np0005474864 nova_compute[192593]: 2025-10-07 20:20:06.181 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:07 np0005474864 nova_compute[192593]: 2025-10-07 20:20:07.094 2 DEBUG nova.network.neutron [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Successfully created port: 9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  7 16:20:08 np0005474864 nova_compute[192593]: 2025-10-07 20:20:08.068 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:08 np0005474864 nova_compute[192593]: 2025-10-07 20:20:08.860 2 DEBUG nova.network.neutron [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Successfully updated port: 9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:20:08 np0005474864 nova_compute[192593]: 2025-10-07 20:20:08.986 2 DEBUG nova.compute.manager [req-b6cb5c00-1d82-4865-a2cd-ccaa8be7c849 req-c30522e3-161d-4a02-8f3e-447969758096 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Received event network-changed-9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:20:08 np0005474864 nova_compute[192593]: 2025-10-07 20:20:08.986 2 DEBUG nova.compute.manager [req-b6cb5c00-1d82-4865-a2cd-ccaa8be7c849 req-c30522e3-161d-4a02-8f3e-447969758096 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Refreshing instance network info cache due to event network-changed-9d52d9d8-4162-4c1a-a1d3-e9539bb3c503. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:20:08 np0005474864 nova_compute[192593]: 2025-10-07 20:20:08.987 2 DEBUG oslo_concurrency.lockutils [req-b6cb5c00-1d82-4865-a2cd-ccaa8be7c849 req-c30522e3-161d-4a02-8f3e-447969758096 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-30de244b-c8b3-47e1-99a2-f00752af916f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:20:08 np0005474864 nova_compute[192593]: 2025-10-07 20:20:08.987 2 DEBUG oslo_concurrency.lockutils [req-b6cb5c00-1d82-4865-a2cd-ccaa8be7c849 req-c30522e3-161d-4a02-8f3e-447969758096 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-30de244b-c8b3-47e1-99a2-f00752af916f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:20:08 np0005474864 nova_compute[192593]: 2025-10-07 20:20:08.987 2 DEBUG nova.network.neutron [req-b6cb5c00-1d82-4865-a2cd-ccaa8be7c849 req-c30522e3-161d-4a02-8f3e-447969758096 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Refreshing network info cache for port 9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:20:09 np0005474864 nova_compute[192593]: 2025-10-07 20:20:09.028 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:09 np0005474864 nova_compute[192593]: 2025-10-07 20:20:09.121 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "refresh_cache-30de244b-c8b3-47e1-99a2-f00752af916f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:20:09 np0005474864 podman[227425]: 2025-10-07 20:20:09.375735838 +0000 UTC m=+0.071114939 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:20:09 np0005474864 podman[227426]: 2025-10-07 20:20:09.389230517 +0000 UTC m=+0.081593511 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, distribution-scope=public, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vendor=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, version=9.6, io.buildah.version=1.33.7, architecture=x86_64)
Oct  7 16:20:09 np0005474864 nova_compute[192593]: 2025-10-07 20:20:09.502 2 DEBUG nova.network.neutron [req-b6cb5c00-1d82-4865-a2cd-ccaa8be7c849 req-c30522e3-161d-4a02-8f3e-447969758096 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:20:09 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:09Z|00214|memory_trim|INFO|Detected inactivity (last active 30003 ms ago): trimming memory
Oct  7 16:20:09 np0005474864 nova_compute[192593]: 2025-10-07 20:20:09.975 2 DEBUG nova.network.neutron [req-b6cb5c00-1d82-4865-a2cd-ccaa8be7c849 req-c30522e3-161d-4a02-8f3e-447969758096 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:20:09 np0005474864 nova_compute[192593]: 2025-10-07 20:20:09.994 2 DEBUG oslo_concurrency.lockutils [req-b6cb5c00-1d82-4865-a2cd-ccaa8be7c849 req-c30522e3-161d-4a02-8f3e-447969758096 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-30de244b-c8b3-47e1-99a2-f00752af916f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:20:09 np0005474864 nova_compute[192593]: 2025-10-07 20:20:09.995 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquired lock "refresh_cache-30de244b-c8b3-47e1-99a2-f00752af916f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:20:09 np0005474864 nova_compute[192593]: 2025-10-07 20:20:09.995 2 DEBUG nova.network.neutron [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:20:10 np0005474864 nova_compute[192593]: 2025-10-07 20:20:10.514 2 DEBUG nova.network.neutron [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.789 2 DEBUG nova.network.neutron [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Updating instance_info_cache with network_info: [{"id": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "address": "fa:16:3e:43:48:ff", "network": {"id": "e65ee0da-6c97-4834-a9da-4a86620baf5d", "bridge": "br-int", "label": "tempest-network-smoke--56963165", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d52d9d8-41", "ovs_interfaceid": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.821 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Releasing lock "refresh_cache-30de244b-c8b3-47e1-99a2-f00752af916f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.821 2 DEBUG nova.compute.manager [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Instance network_info: |[{"id": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "address": "fa:16:3e:43:48:ff", "network": {"id": "e65ee0da-6c97-4834-a9da-4a86620baf5d", "bridge": "br-int", "label": "tempest-network-smoke--56963165", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d52d9d8-41", "ovs_interfaceid": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.827 2 DEBUG nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Start _get_guest_xml network_info=[{"id": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "address": "fa:16:3e:43:48:ff", "network": {"id": "e65ee0da-6c97-4834-a9da-4a86620baf5d", "bridge": "br-int", "label": "tempest-network-smoke--56963165", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d52d9d8-41", "ovs_interfaceid": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.836 2 WARNING nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.841 2 DEBUG nova.virt.libvirt.host [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.842 2 DEBUG nova.virt.libvirt.host [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.847 2 DEBUG nova.virt.libvirt.host [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.848 2 DEBUG nova.virt.libvirt.host [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.850 2 DEBUG nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.850 2 DEBUG nova.virt.hardware [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T20:09:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3fec056a-1226-48ad-a02c-e4fe097a9363',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.851 2 DEBUG nova.virt.hardware [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.852 2 DEBUG nova.virt.hardware [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.852 2 DEBUG nova.virt.hardware [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.853 2 DEBUG nova.virt.hardware [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.853 2 DEBUG nova.virt.hardware [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.854 2 DEBUG nova.virt.hardware [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.855 2 DEBUG nova.virt.hardware [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.855 2 DEBUG nova.virt.hardware [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.855 2 DEBUG nova.virt.hardware [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.856 2 DEBUG nova.virt.hardware [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.862 2 DEBUG nova.virt.libvirt.vif [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:20:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1488044028',display_name='tempest-TestNetworkBasicOps-server-1488044028',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1488044028',id=42,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD3fP+kEZm1BDnVq1jZ5StbwasJe3y53EdxGsaTK8aISqUdvl2VgCBatFl3aTna8qxy93lplQmDnHOkiqmSZMOoitAgysFHYmhH01/JGskYdF7QWmUbGmk7TM9O9Qc7FbA==',key_name='tempest-TestNetworkBasicOps-509323753',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-zf35ijy7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:20:05Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=30de244b-c8b3-47e1-99a2-f00752af916f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "address": "fa:16:3e:43:48:ff", "network": {"id": "e65ee0da-6c97-4834-a9da-4a86620baf5d", "bridge": "br-int", "label": "tempest-network-smoke--56963165", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], 
"version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d52d9d8-41", "ovs_interfaceid": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.863 2 DEBUG nova.network.os_vif_util [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converting VIF {"id": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "address": "fa:16:3e:43:48:ff", "network": {"id": "e65ee0da-6c97-4834-a9da-4a86620baf5d", "bridge": "br-int", "label": "tempest-network-smoke--56963165", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d52d9d8-41", "ovs_interfaceid": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.864 2 DEBUG nova.network.os_vif_util [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:48:ff,bridge_name='br-int',has_traffic_filtering=True,id=9d52d9d8-4162-4c1a-a1d3-e9539bb3c503,network=Network(e65ee0da-6c97-4834-a9da-4a86620baf5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d52d9d8-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.866 2 DEBUG nova.objects.instance [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lazy-loading 'pci_devices' on Instance uuid 30de244b-c8b3-47e1-99a2-f00752af916f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.881 2 DEBUG nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] End _get_guest_xml xml=<domain type="kvm">
Oct  7 16:20:11 np0005474864 nova_compute[192593]:  <uuid>30de244b-c8b3-47e1-99a2-f00752af916f</uuid>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:  <name>instance-0000002a</name>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:  <memory>131072</memory>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:  <vcpu>1</vcpu>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <nova:name>tempest-TestNetworkBasicOps-server-1488044028</nova:name>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <nova:creationTime>2025-10-07 20:20:11</nova:creationTime>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <nova:flavor name="m1.nano">
Oct  7 16:20:11 np0005474864 nova_compute[192593]:        <nova:memory>128</nova:memory>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:        <nova:disk>1</nova:disk>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:        <nova:swap>0</nova:swap>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:        <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:        <nova:vcpus>1</nova:vcpus>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      </nova:flavor>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <nova:owner>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:        <nova:user uuid="fde8db13cdde4728903e9d2749f853e1">tempest-TestNetworkBasicOps-666319938-project-member</nova:user>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:        <nova:project uuid="57491b24c6b2419c842483a87c8b4d42">tempest-TestNetworkBasicOps-666319938</nova:project>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      </nova:owner>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <nova:ports>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:        <nova:port uuid="9d52d9d8-4162-4c1a-a1d3-e9539bb3c503">
Oct  7 16:20:11 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      </nova:ports>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    </nova:instance>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:  <sysinfo type="smbios">
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <entry name="manufacturer">RDO</entry>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <entry name="product">OpenStack Compute</entry>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <entry name="serial">30de244b-c8b3-47e1-99a2-f00752af916f</entry>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <entry name="uuid">30de244b-c8b3-47e1-99a2-f00752af916f</entry>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <entry name="family">Virtual Machine</entry>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <boot dev="hd"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <smbios mode="sysinfo"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <vmcoreinfo/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:  <clock offset="utc">
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <timer name="pit" tickpolicy="delay"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <timer name="hpet" present="no"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:  <cpu mode="custom" match="exact">
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <model>Nehalem</model>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <topology sockets="1" cores="1" threads="1"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <disk type="file" device="disk">
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/disk"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <target dev="vda" bus="virtio"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <disk type="file" device="cdrom">
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <driver name="qemu" type="raw" cache="none"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/disk.config"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <target dev="sda" bus="sata"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:43:48:ff"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <target dev="tap9d52d9d8-41"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <serial type="pty">
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <log file="/var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/console.log" append="off"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <input type="tablet" bus="usb"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <rng model="virtio">
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <backend model="random">/dev/urandom</backend>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <controller type="usb" index="0"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    <memballoon model="virtio">
Oct  7 16:20:11 np0005474864 nova_compute[192593]:      <stats period="10"/>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:20:11 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:20:11 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:20:11 np0005474864 nova_compute[192593]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.883 2 DEBUG nova.compute.manager [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Preparing to wait for external event network-vif-plugged-9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.884 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "30de244b-c8b3-47e1-99a2-f00752af916f-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.884 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "30de244b-c8b3-47e1-99a2-f00752af916f-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.885 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "30de244b-c8b3-47e1-99a2-f00752af916f-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.886 2 DEBUG nova.virt.libvirt.vif [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:20:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1488044028',display_name='tempest-TestNetworkBasicOps-server-1488044028',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1488044028',id=42,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD3fP+kEZm1BDnVq1jZ5StbwasJe3y53EdxGsaTK8aISqUdvl2VgCBatFl3aTna8qxy93lplQmDnHOkiqmSZMOoitAgysFHYmhH01/JGskYdF7QWmUbGmk7TM9O9Qc7FbA==',key_name='tempest-TestNetworkBasicOps-509323753',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-zf35ijy7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:20:05Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=30de244b-c8b3-47e1-99a2-f00752af916f,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "address": "fa:16:3e:43:48:ff", "network": {"id": "e65ee0da-6c97-4834-a9da-4a86620baf5d", "bridge": "br-int", "label": "tempest-network-smoke--56963165", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": 
[], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d52d9d8-41", "ovs_interfaceid": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.887 2 DEBUG nova.network.os_vif_util [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converting VIF {"id": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "address": "fa:16:3e:43:48:ff", "network": {"id": "e65ee0da-6c97-4834-a9da-4a86620baf5d", "bridge": "br-int", "label": "tempest-network-smoke--56963165", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d52d9d8-41", "ovs_interfaceid": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.887 2 DEBUG nova.network.os_vif_util [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:43:48:ff,bridge_name='br-int',has_traffic_filtering=True,id=9d52d9d8-4162-4c1a-a1d3-e9539bb3c503,network=Network(e65ee0da-6c97-4834-a9da-4a86620baf5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d52d9d8-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.888 2 DEBUG os_vif [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:48:ff,bridge_name='br-int',has_traffic_filtering=True,id=9d52d9d8-4162-4c1a-a1d3-e9539bb3c503,network=Network(e65ee0da-6c97-4834-a9da-4a86620baf5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d52d9d8-41') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.889 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.890 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.890 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.895 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.895 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap9d52d9d8-41, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.896 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap9d52d9d8-41, col_values=(('external_ids', {'iface-id': '9d52d9d8-4162-4c1a-a1d3-e9539bb3c503', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:43:48:ff', 'vm-uuid': '30de244b-c8b3-47e1-99a2-f00752af916f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.899 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:11 np0005474864 NetworkManager[51631]: <info>  [1759868411.9009] manager: (tap9d52d9d8-41): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/111)
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.908 2 INFO os_vif [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:43:48:ff,bridge_name='br-int',has_traffic_filtering=True,id=9d52d9d8-4162-4c1a-a1d3-e9539bb3c503,network=Network(e65ee0da-6c97-4834-a9da-4a86620baf5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d52d9d8-41')#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.984 2 DEBUG nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.985 2 DEBUG nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.985 2 DEBUG nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] No VIF found with MAC fa:16:3e:43:48:ff, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:20:11 np0005474864 nova_compute[192593]: 2025-10-07 20:20:11.986 2 INFO nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Using config drive#033[00m
Oct  7 16:20:12 np0005474864 nova_compute[192593]: 2025-10-07 20:20:12.352 2 INFO nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Creating config drive at /var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/disk.config#033[00m
Oct  7 16:20:12 np0005474864 nova_compute[192593]: 2025-10-07 20:20:12.360 2 DEBUG oslo_concurrency.processutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdqtmzgp_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:20:12 np0005474864 nova_compute[192593]: 2025-10-07 20:20:12.491 2 DEBUG oslo_concurrency.processutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdqtmzgp_" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:20:12 np0005474864 kernel: tap9d52d9d8-41: entered promiscuous mode
Oct  7 16:20:12 np0005474864 NetworkManager[51631]: <info>  [1759868412.5809] manager: (tap9d52d9d8-41): new Tun device (/org/freedesktop/NetworkManager/Devices/112)
Oct  7 16:20:12 np0005474864 nova_compute[192593]: 2025-10-07 20:20:12.582 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:12 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:12Z|00215|binding|INFO|Claiming lport 9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 for this chassis.
Oct  7 16:20:12 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:12Z|00216|binding|INFO|9d52d9d8-4162-4c1a-a1d3-e9539bb3c503: Claiming fa:16:3e:43:48:ff 10.100.0.10
Oct  7 16:20:12 np0005474864 nova_compute[192593]: 2025-10-07 20:20:12.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:12 np0005474864 nova_compute[192593]: 2025-10-07 20:20:12.592 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:12 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:12.616 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:48:ff 10.100.0.10'], port_security=['fa:16:3e:43:48:ff 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65ee0da-6c97-4834-a9da-4a86620baf5d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '57491b24c6b2419c842483a87c8b4d42', 'neutron:revision_number': '2', 'neutron:security_group_ids': '20f1f1d1-1339-4926-afbc-e9bfffee82bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4e52e3f0-bafc-4319-a52a-27ab7ae25fb4, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=9d52d9d8-4162-4c1a-a1d3-e9539bb3c503) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:20:12 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:12.617 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 in datapath e65ee0da-6c97-4834-a9da-4a86620baf5d bound to our chassis#033[00m
Oct  7 16:20:12 np0005474864 systemd-udevd[227489]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:20:12 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:12.620 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e65ee0da-6c97-4834-a9da-4a86620baf5d#033[00m
Oct  7 16:20:12 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:12.639 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[29ff477f-1784-4793-8345-95fa7c8e5e84]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:12 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:12.641 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape65ee0da-61 in ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:20:12 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:12.643 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape65ee0da-60 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:20:12 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:12.643 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[2c3818a0-69b5-45b6-8f98-a88599bdabf4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:12 np0005474864 systemd-machined[152586]: New machine qemu-14-instance-0000002a.
Oct  7 16:20:12 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:12.645 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[a01d2fe9-d66a-409d-b80f-59947065542e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:12 np0005474864 NetworkManager[51631]: <info>  [1759868412.6486] device (tap9d52d9d8-41): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:20:12 np0005474864 NetworkManager[51631]: <info>  [1759868412.6500] device (tap9d52d9d8-41): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:20:12 np0005474864 nova_compute[192593]: 2025-10-07 20:20:12.649 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:12 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:12Z|00217|binding|INFO|Setting lport 9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 ovn-installed in OVS
Oct  7 16:20:12 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:12Z|00218|binding|INFO|Setting lport 9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 up in Southbound
Oct  7 16:20:12 np0005474864 nova_compute[192593]: 2025-10-07 20:20:12.653 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:12 np0005474864 systemd[1]: Started Virtual Machine qemu-14-instance-0000002a.
Oct  7 16:20:12 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:12.661 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[f110d7a2-6c50-40af-af66-ffd990c8f539]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:12 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:12.681 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[92615225-7edd-450d-b95f-566af31361ee]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:12 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:12.729 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[3f6eb3d0-ba03-4382-9647-2c26f76ca663]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:12 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:12.735 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c145f841-01e1-4add-9a5b-f5c7910d3a93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:12 np0005474864 NetworkManager[51631]: <info>  [1759868412.7371] manager: (tape65ee0da-60): new Veth device (/org/freedesktop/NetworkManager/Devices/113)
Oct  7 16:20:12 np0005474864 systemd-udevd[227494]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:20:12 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:12.792 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[360235cf-0e9a-480e-93cf-b83868905d1a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:12 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:12.797 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[e5672c0c-61e1-4d51-b8eb-36360fe74724]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:12 np0005474864 NetworkManager[51631]: <info>  [1759868412.8328] device (tape65ee0da-60): carrier: link connected
Oct  7 16:20:12 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:12.843 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[0d384197-9cc7-4acf-94fe-70859b4cebb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:12 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:12.870 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ea0685ac-1ef0-4cb6-99a2-f1464c4e5e81]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65ee0da-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7a:01:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404872, 'reachable_time': 34539, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227524, 'error': None, 'target': 'ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:12 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:12.892 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[386260b8-0acb-4516-baf7-e402dbd72ed7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7a:15d'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 404872, 'tstamp': 404872}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 227525, 'error': None, 'target': 'ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:12 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:12.920 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[534dbd24-a49a-42f3-b328-0419b883e4ad]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape65ee0da-61'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7a:01:5d'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 65], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404872, 'reachable_time': 34539, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 227526, 'error': None, 'target': 'ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:12 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:12.974 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[7703ca08-9826-4e24-9c39-a05f3ec37f1c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:13.068 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[df09c1dc-29ff-499d-9ade-39a6f1016f7c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:13.070 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65ee0da-60, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:13.070 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:13.071 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape65ee0da-60, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:13 np0005474864 nova_compute[192593]: 2025-10-07 20:20:13.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:13 np0005474864 kernel: tape65ee0da-60: entered promiscuous mode
Oct  7 16:20:13 np0005474864 nova_compute[192593]: 2025-10-07 20:20:13.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:13 np0005474864 NetworkManager[51631]: <info>  [1759868413.0783] manager: (tape65ee0da-60): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:13.084 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape65ee0da-60, col_values=(('external_ids', {'iface-id': '896ce339-edeb-4cb5-8a18-42b2beaa06e9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:13 np0005474864 nova_compute[192593]: 2025-10-07 20:20:13.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:13 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:13Z|00219|binding|INFO|Releasing lport 896ce339-edeb-4cb5-8a18-42b2beaa06e9 from this chassis (sb_readonly=0)
Oct  7 16:20:13 np0005474864 nova_compute[192593]: 2025-10-07 20:20:13.087 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:13.088 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e65ee0da-6c97-4834-a9da-4a86620baf5d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e65ee0da-6c97-4834-a9da-4a86620baf5d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:13.089 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c71fd992-5929-4d41-97bc-5a82c3817cb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:13.090 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-e65ee0da-6c97-4834-a9da-4a86620baf5d
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/e65ee0da-6c97-4834-a9da-4a86620baf5d.pid.haproxy
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID e65ee0da-6c97-4834-a9da-4a86620baf5d
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:20:13 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:13.092 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d', 'env', 'PROCESS_TAG=haproxy-e65ee0da-6c97-4834-a9da-4a86620baf5d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e65ee0da-6c97-4834-a9da-4a86620baf5d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:20:13 np0005474864 nova_compute[192593]: 2025-10-07 20:20:13.102 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:13 np0005474864 podman[227564]: 2025-10-07 20:20:13.50306328 +0000 UTC m=+0.053941894 container create 26698238da8ec6c21cb8d8e55b1a1046a4f4cb498928300e6373f1781e51a71e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  7 16:20:13 np0005474864 systemd[1]: Started libpod-conmon-26698238da8ec6c21cb8d8e55b1a1046a4f4cb498928300e6373f1781e51a71e.scope.
Oct  7 16:20:13 np0005474864 podman[227564]: 2025-10-07 20:20:13.475708222 +0000 UTC m=+0.026586836 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:20:13 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:20:13 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/97fb75e5a9bb2b2a812cae0581a54ae7d4478b9ab6d7ad65eb0260dc712ae429/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:20:13 np0005474864 podman[227564]: 2025-10-07 20:20:13.617093093 +0000 UTC m=+0.167971727 container init 26698238da8ec6c21cb8d8e55b1a1046a4f4cb498928300e6373f1781e51a71e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:20:13 np0005474864 podman[227581]: 2025-10-07 20:20:13.62044436 +0000 UTC m=+0.063047297 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd)
Oct  7 16:20:13 np0005474864 podman[227564]: 2025-10-07 20:20:13.622794027 +0000 UTC m=+0.173672641 container start 26698238da8ec6c21cb8d8e55b1a1046a4f4cb498928300e6373f1781e51a71e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:20:13 np0005474864 podman[227577]: 2025-10-07 20:20:13.641142686 +0000 UTC m=+0.090311612 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  7 16:20:13 np0005474864 neutron-haproxy-ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d[227586]: [NOTICE]   (227636) : New worker (227642) forked
Oct  7 16:20:13 np0005474864 neutron-haproxy-ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d[227586]: [NOTICE]   (227636) : Loading success.
Oct  7 16:20:13 np0005474864 podman[227580]: 2025-10-07 20:20:13.662843701 +0000 UTC m=+0.107563709 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:20:13 np0005474864 nova_compute[192593]: 2025-10-07 20:20:13.726 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868413.725288, 30de244b-c8b3-47e1-99a2-f00752af916f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:20:13 np0005474864 nova_compute[192593]: 2025-10-07 20:20:13.727 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] VM Started (Lifecycle Event)#033[00m
Oct  7 16:20:13 np0005474864 nova_compute[192593]: 2025-10-07 20:20:13.756 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:20:13 np0005474864 nova_compute[192593]: 2025-10-07 20:20:13.763 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868413.726107, 30de244b-c8b3-47e1-99a2-f00752af916f => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:20:13 np0005474864 nova_compute[192593]: 2025-10-07 20:20:13.764 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:20:13 np0005474864 nova_compute[192593]: 2025-10-07 20:20:13.785 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:20:13 np0005474864 nova_compute[192593]: 2025-10-07 20:20:13.790 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:20:13 np0005474864 nova_compute[192593]: 2025-10-07 20:20:13.813 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.031 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.633 2 DEBUG nova.compute.manager [req-5b35c2a3-93ea-4263-a300-be157fc10de9 req-1ec94de6-41c8-4e2d-84ee-d39ad3b54e89 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Received event network-vif-plugged-9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.634 2 DEBUG oslo_concurrency.lockutils [req-5b35c2a3-93ea-4263-a300-be157fc10de9 req-1ec94de6-41c8-4e2d-84ee-d39ad3b54e89 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "30de244b-c8b3-47e1-99a2-f00752af916f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.634 2 DEBUG oslo_concurrency.lockutils [req-5b35c2a3-93ea-4263-a300-be157fc10de9 req-1ec94de6-41c8-4e2d-84ee-d39ad3b54e89 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "30de244b-c8b3-47e1-99a2-f00752af916f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.635 2 DEBUG oslo_concurrency.lockutils [req-5b35c2a3-93ea-4263-a300-be157fc10de9 req-1ec94de6-41c8-4e2d-84ee-d39ad3b54e89 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "30de244b-c8b3-47e1-99a2-f00752af916f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.636 2 DEBUG nova.compute.manager [req-5b35c2a3-93ea-4263-a300-be157fc10de9 req-1ec94de6-41c8-4e2d-84ee-d39ad3b54e89 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Processing event network-vif-plugged-9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.636 2 DEBUG nova.compute.manager [req-5b35c2a3-93ea-4263-a300-be157fc10de9 req-1ec94de6-41c8-4e2d-84ee-d39ad3b54e89 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Received event network-vif-plugged-9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.637 2 DEBUG oslo_concurrency.lockutils [req-5b35c2a3-93ea-4263-a300-be157fc10de9 req-1ec94de6-41c8-4e2d-84ee-d39ad3b54e89 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "30de244b-c8b3-47e1-99a2-f00752af916f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.637 2 DEBUG oslo_concurrency.lockutils [req-5b35c2a3-93ea-4263-a300-be157fc10de9 req-1ec94de6-41c8-4e2d-84ee-d39ad3b54e89 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "30de244b-c8b3-47e1-99a2-f00752af916f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.637 2 DEBUG oslo_concurrency.lockutils [req-5b35c2a3-93ea-4263-a300-be157fc10de9 req-1ec94de6-41c8-4e2d-84ee-d39ad3b54e89 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "30de244b-c8b3-47e1-99a2-f00752af916f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.638 2 DEBUG nova.compute.manager [req-5b35c2a3-93ea-4263-a300-be157fc10de9 req-1ec94de6-41c8-4e2d-84ee-d39ad3b54e89 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] No waiting events found dispatching network-vif-plugged-9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.638 2 WARNING nova.compute.manager [req-5b35c2a3-93ea-4263-a300-be157fc10de9 req-1ec94de6-41c8-4e2d-84ee-d39ad3b54e89 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Received unexpected event network-vif-plugged-9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 for instance with vm_state building and task_state spawning.#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.640 2 DEBUG nova.compute.manager [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.644 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868414.6441097, 30de244b-c8b3-47e1-99a2-f00752af916f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.645 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.647 2 DEBUG nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.652 2 INFO nova.virt.libvirt.driver [-] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Instance spawned successfully.#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.652 2 DEBUG nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.680 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.687 2 DEBUG nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.687 2 DEBUG nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.688 2 DEBUG nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.688 2 DEBUG nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.689 2 DEBUG nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.689 2 DEBUG nova.virt.libvirt.driver [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.695 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.749 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.771 2 INFO nova.compute.manager [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Took 9.15 seconds to spawn the instance on the hypervisor.#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.771 2 DEBUG nova.compute.manager [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.867 2 INFO nova.compute.manager [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Took 10.43 seconds to build instance.#033[00m
Oct  7 16:20:14 np0005474864 nova_compute[192593]: 2025-10-07 20:20:14.907 2 DEBUG oslo_concurrency.lockutils [None req-702703f9-d9b6-48f9-900a-e6f2b05e4913 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "30de244b-c8b3-47e1-99a2-f00752af916f" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.765s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:16.194 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:16.195 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:16.195 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:16 np0005474864 nova_compute[192593]: 2025-10-07 20:20:16.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:19 np0005474864 nova_compute[192593]: 2025-10-07 20:20:19.033 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:20 np0005474864 nova_compute[192593]: 2025-10-07 20:20:20.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:20 np0005474864 NetworkManager[51631]: <info>  [1759868420.4770] manager: (patch-provnet-53a6f422-cce6-4082-98aa-3619989346bc-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Oct  7 16:20:20 np0005474864 NetworkManager[51631]: <info>  [1759868420.4791] manager: (patch-br-int-to-provnet-53a6f422-cce6-4082-98aa-3619989346bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/116)
Oct  7 16:20:20 np0005474864 nova_compute[192593]: 2025-10-07 20:20:20.590 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:20 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:20Z|00220|binding|INFO|Releasing lport 896ce339-edeb-4cb5-8a18-42b2beaa06e9 from this chassis (sb_readonly=0)
Oct  7 16:20:20 np0005474864 nova_compute[192593]: 2025-10-07 20:20:20.619 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:21 np0005474864 podman[227655]: 2025-10-07 20:20:21.38357904 +0000 UTC m=+0.073527668 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  7 16:20:21 np0005474864 nova_compute[192593]: 2025-10-07 20:20:21.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:23 np0005474864 podman[227672]: 2025-10-07 20:20:23.366326372 +0000 UTC m=+0.058099754 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:20:24 np0005474864 nova_compute[192593]: 2025-10-07 20:20:24.035 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:25 np0005474864 nova_compute[192593]: 2025-10-07 20:20:25.984 2 DEBUG nova.compute.manager [req-1b90668a-479d-4dbe-bfcf-2ca9fe23b358 req-eb1c57e4-652f-4961-bbaf-f755df3b048f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Received event network-changed-9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:20:25 np0005474864 nova_compute[192593]: 2025-10-07 20:20:25.985 2 DEBUG nova.compute.manager [req-1b90668a-479d-4dbe-bfcf-2ca9fe23b358 req-eb1c57e4-652f-4961-bbaf-f755df3b048f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Refreshing instance network info cache due to event network-changed-9d52d9d8-4162-4c1a-a1d3-e9539bb3c503. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:20:25 np0005474864 nova_compute[192593]: 2025-10-07 20:20:25.987 2 DEBUG oslo_concurrency.lockutils [req-1b90668a-479d-4dbe-bfcf-2ca9fe23b358 req-eb1c57e4-652f-4961-bbaf-f755df3b048f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-30de244b-c8b3-47e1-99a2-f00752af916f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:20:25 np0005474864 nova_compute[192593]: 2025-10-07 20:20:25.987 2 DEBUG oslo_concurrency.lockutils [req-1b90668a-479d-4dbe-bfcf-2ca9fe23b358 req-eb1c57e4-652f-4961-bbaf-f755df3b048f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-30de244b-c8b3-47e1-99a2-f00752af916f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:20:25 np0005474864 nova_compute[192593]: 2025-10-07 20:20:25.987 2 DEBUG nova.network.neutron [req-1b90668a-479d-4dbe-bfcf-2ca9fe23b358 req-eb1c57e4-652f-4961-bbaf-f755df3b048f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Refreshing network info cache for port 9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.119 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.120 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.120 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.120 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.200 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.295 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/disk --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.297 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.397 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.617 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.620 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5547MB free_disk=73.43553924560547GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.620 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.621 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.702 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Instance 30de244b-c8b3-47e1-99a2-f00752af916f actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.703 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.703 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.721 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Refreshing inventories for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.743 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Updating ProviderTree inventory for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.743 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Updating inventory in ProviderTree for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.757 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Refreshing aggregate associations for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.774 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Refreshing trait associations for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.815 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.834 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.863 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.864 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.243s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:26 np0005474864 nova_compute[192593]: 2025-10-07 20:20:26.957 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:27 np0005474864 podman[227721]: 2025-10-07 20:20:27.374088491 +0000 UTC m=+0.066086574 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  7 16:20:27 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:27Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:43:48:ff 10.100.0.10
Oct  7 16:20:27 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:27Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:43:48:ff 10.100.0.10
Oct  7 16:20:27 np0005474864 nova_compute[192593]: 2025-10-07 20:20:27.864 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:20:27 np0005474864 nova_compute[192593]: 2025-10-07 20:20:27.864 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:20:28 np0005474864 nova_compute[192593]: 2025-10-07 20:20:28.087 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:20:29 np0005474864 nova_compute[192593]: 2025-10-07 20:20:29.037 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:29 np0005474864 nova_compute[192593]: 2025-10-07 20:20:29.592 2 DEBUG nova.network.neutron [req-1b90668a-479d-4dbe-bfcf-2ca9fe23b358 req-eb1c57e4-652f-4961-bbaf-f755df3b048f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Updated VIF entry in instance network info cache for port 9d52d9d8-4162-4c1a-a1d3-e9539bb3c503. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:20:29 np0005474864 nova_compute[192593]: 2025-10-07 20:20:29.592 2 DEBUG nova.network.neutron [req-1b90668a-479d-4dbe-bfcf-2ca9fe23b358 req-eb1c57e4-652f-4961-bbaf-f755df3b048f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Updating instance_info_cache with network_info: [{"id": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "address": "fa:16:3e:43:48:ff", "network": {"id": "e65ee0da-6c97-4834-a9da-4a86620baf5d", "bridge": "br-int", "label": "tempest-network-smoke--56963165", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d52d9d8-41", "ovs_interfaceid": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:20:29 np0005474864 nova_compute[192593]: 2025-10-07 20:20:29.627 2 DEBUG oslo_concurrency.lockutils [req-1b90668a-479d-4dbe-bfcf-2ca9fe23b358 req-eb1c57e4-652f-4961-bbaf-f755df3b048f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-30de244b-c8b3-47e1-99a2-f00752af916f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:20:30 np0005474864 nova_compute[192593]: 2025-10-07 20:20:30.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:20:30 np0005474864 nova_compute[192593]: 2025-10-07 20:20:30.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:20:30 np0005474864 nova_compute[192593]: 2025-10-07 20:20:30.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:20:30 np0005474864 nova_compute[192593]: 2025-10-07 20:20:30.555 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "refresh_cache-30de244b-c8b3-47e1-99a2-f00752af916f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:20:30 np0005474864 nova_compute[192593]: 2025-10-07 20:20:30.556 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquired lock "refresh_cache-30de244b-c8b3-47e1-99a2-f00752af916f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:20:30 np0005474864 nova_compute[192593]: 2025-10-07 20:20:30.556 2 DEBUG nova.network.neutron [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  7 16:20:30 np0005474864 nova_compute[192593]: 2025-10-07 20:20:30.557 2 DEBUG nova.objects.instance [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 30de244b-c8b3-47e1-99a2-f00752af916f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.257 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'name': 'tempest-TestNetworkBasicOps-server-1488044028', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002a', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '57491b24c6b2419c842483a87c8b4d42', 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'hostId': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.257 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.273 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.274 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b5075a2d-ecb9-43c4-978a-734f2085b9f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '30de244b-c8b3-47e1-99a2-f00752af916f-vda', 'timestamp': '2025-10-07T20:20:31.257945', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '127adba0-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.207747278, 'message_signature': '834ca3ef863a1eefdb178cffec36695dc3d5be2a76f8c5b20c101affca05431b'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 
'30de244b-c8b3-47e1-99a2-f00752af916f-sda', 'timestamp': '2025-10-07T20:20:31.257945', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '127ae816-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.207747278, 'message_signature': 'ffc812ba9052b5394b44b0f371553b85cbc6c0e9d27b2fd68c09516db18f6370'}]}, 'timestamp': '2025-10-07 20:20:31.274336', '_unique_id': '7666c982ab3941d7ad076239a3fe864f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.275 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.276 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.280 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 30de244b-c8b3-47e1-99a2-f00752af916f / tap9d52d9d8-41 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.280 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a1be9360-578e-4ac6-a70e-7e7038453713', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-0000002a-30de244b-c8b3-47e1-99a2-f00752af916f-tap9d52d9d8-41', 'timestamp': '2025-10-07T20:20:31.276645', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'tap9d52d9d8-41', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:43:48:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d52d9d8-41'}, 'message_id': '127be338-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.226465457, 'message_signature': 'f71150e201d78454dd98c54b81e99d9675efe27691f789a40e99d9d0e48ae7bc'}]}, 'timestamp': '2025-10-07 20:20:31.280773', '_unique_id': 'fcb87e745a7a4922894a49eaa4297a10'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.281 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.282 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.282 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.282 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9905284-f921-4d96-8392-4ef337d5c976', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '30de244b-c8b3-47e1-99a2-f00752af916f-vda', 'timestamp': '2025-10-07T20:20:31.282594', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '127c36bc-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.207747278, 'message_signature': '4da16d1df7914eea03912349e8b68b1af05e00ed2ce9fd2f4ed9ef362b9eb991'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '30de244b-c8b3-47e1-99a2-f00752af916f-sda', 'timestamp': '2025-10-07T20:20:31.282594', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '127c4058-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.207747278, 'message_signature': '1d6b8ca1d76d82889aefe8162625f8f030fe0573a2435d87e322aad4ab08cbf9'}]}, 'timestamp': '2025-10-07 20:20:31.283130', '_unique_id': '1139de96633f41b1947cf554cc162de8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.283 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.284 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.284 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.284 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '57830edf-0e34-4857-af30-f6673a761ea2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '30de244b-c8b3-47e1-99a2-f00752af916f-vda', 'timestamp': '2025-10-07T20:20:31.284580', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '127c83d8-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.207747278, 'message_signature': 'e996a3c568b69bd455f7c5b3c26c914c3a4ede3926dd328b2785d03942674314'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '30de244b-c8b3-47e1-99a2-f00752af916f-sda', 'timestamp': '2025-10-07T20:20:31.284580', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '127c8d6a-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.207747278, 'message_signature': '8aa76eac0970e8d87235cc7dae596540e0d02a029848fee9216db70c369c8690'}]}, 'timestamp': '2025-10-07 20:20:31.285100', '_unique_id': '5697a1e0416445afa2043f6bad09d035'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.285 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.286 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.287 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.287 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1488044028>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1488044028>]
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.287 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.287 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'dedc1047-edc4-443b-954e-54a165ccd193', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-0000002a-30de244b-c8b3-47e1-99a2-f00752af916f-tap9d52d9d8-41', 'timestamp': '2025-10-07T20:20:31.287637', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'tap9d52d9d8-41', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:43:48:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d52d9d8-41'}, 'message_id': '127cfde0-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.226465457, 'message_signature': '41e713f1e95494eed78469150844532b26a8f4be5dafc5f28a6314a560cecd65'}]}, 'timestamp': '2025-10-07 20:20:31.288023', '_unique_id': 'da074f53b7d248a3835388e34cd024f3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.288 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.290 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.320 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/disk.device.write.bytes volume: 72753152 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.321 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ab3ead7-e83d-4ed5-aa35-2543f14828a7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72753152, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '30de244b-c8b3-47e1-99a2-f00752af916f-vda', 'timestamp': '2025-10-07T20:20:31.290154', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '128207cc-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.239985987, 'message_signature': '98395726e884374afc9905dc23184d1489c567b21246ac21491c6a5db2bcf83c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 
'resource_id': '30de244b-c8b3-47e1-99a2-f00752af916f-sda', 'timestamp': '2025-10-07T20:20:31.290154', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '12822234-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.239985987, 'message_signature': '8c68bcff08ffaa34da1f046381af08993b8b61780cf73ba18629587462ce334e'}]}, 'timestamp': '2025-10-07 20:20:31.321789', '_unique_id': '865958a8929f4bd28d4f9ccaf72e417a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.323 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.324 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.325 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.325 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1488044028>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1488044028>]
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.325 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.326 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/disk.device.write.latency volume: 3605783060 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.326 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '02385846-3405-4e13-9d75-686e29b19f88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3605783060, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '30de244b-c8b3-47e1-99a2-f00752af916f-vda', 'timestamp': '2025-10-07T20:20:31.325990', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '1282dcf6-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.239985987, 'message_signature': 'f536a9f0ef1702a82891ee3f05c586d3ef48f6f11cb765e6187ff67f244bea7b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 
'resource_id': '30de244b-c8b3-47e1-99a2-f00752af916f-sda', 'timestamp': '2025-10-07T20:20:31.325990', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '1282f10a-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.239985987, 'message_signature': 'e4021df38d3e13b5e5dc15822f3587238ab2c0e87629c6a21554e728a19eee63'}]}, 'timestamp': '2025-10-07 20:20:31.327093', '_unique_id': 'c5f5972802544816aadfb370ee5413ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.328 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.329 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.353 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1b48548e-4968-437f-980d-b8f7dd3838c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'timestamp': '2025-10-07T20:20:31.329994', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '12872c70-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.303455654, 'message_signature': '215c999c4a510d857eeec0a30baefb109f5df02c69e223ac8255b92e14d82344'}]}, 'timestamp': '2025-10-07 20:20:31.354901', '_unique_id': '6f16c79895fd493abb4f28292a04b042'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.356 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.358 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.359 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/network.outgoing.bytes volume: 1326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'be6de763-693d-43e9-a912-e7ee1ed8090d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1326, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-0000002a-30de244b-c8b3-47e1-99a2-f00752af916f-tap9d52d9d8-41', 'timestamp': '2025-10-07T20:20:31.359365', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'tap9d52d9d8-41', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:43:48:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d52d9d8-41'}, 'message_id': '1287fb8c-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.226465457, 'message_signature': '880bcac998a84028306101042e7e583cd583dfda3f53f746de748963cac051f9'}]}, 'timestamp': '2025-10-07 20:20:31.360359', '_unique_id': '923edfeccc8a42b39140ddc830f3479a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.362 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.364 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.365 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/network.incoming.bytes volume: 2044 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '359b8b22-3c84-4652-9d23-e2c5386afde1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2044, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-0000002a-30de244b-c8b3-47e1-99a2-f00752af916f-tap9d52d9d8-41', 'timestamp': '2025-10-07T20:20:31.365136', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'tap9d52d9d8-41', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:43:48:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d52d9d8-41'}, 'message_id': '1288ddae-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.226465457, 'message_signature': '66f77df6d84f55d76e30ccf12773d4691c8e6f26d64d9e6b71b8f990e207caa7'}]}, 'timestamp': '2025-10-07 20:20:31.366051', '_unique_id': '76e9de6931cd4bdda9a522076b731acc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.367 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.370 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.370 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/network.outgoing.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0f76d20-f10b-4272-824c-da14d5330aa5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-0000002a-30de244b-c8b3-47e1-99a2-f00752af916f-tap9d52d9d8-41', 'timestamp': '2025-10-07T20:20:31.370490', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'tap9d52d9d8-41', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:43:48:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d52d9d8-41'}, 'message_id': '1289aca2-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.226465457, 'message_signature': '776c61aae8c4a35357f8297b6db16df9416f53cb5b729cee403135ac4d59a090'}]}, 'timestamp': '2025-10-07 20:20:31.371383', '_unique_id': '4df1dc7cbaca47f0bcb9e517fed95232'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.372 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.374 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.375 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/disk.device.read.requests volume: 1089 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.376 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '69fc6775-079d-4d27-97df-10a7a26b57f8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1089, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '30de244b-c8b3-47e1-99a2-f00752af916f-vda', 'timestamp': '2025-10-07T20:20:31.375441', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '128a71aa-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.239985987, 'message_signature': 'a0fb684ce4bff3a9699e4058b8ea5e34dedba9a97c630697f8ec8bcee1d96c88'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '30de244b-c8b3-47e1-99a2-f00752af916f-sda', 'timestamp': '2025-10-07T20:20:31.375441', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '128a8b90-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.239985987, 'message_signature': 'ccc3de4558bd438fff4c7dcae7e29f024dcfa0bab39f71b4516fd764fd26a047'}]}, 'timestamp': '2025-10-07 20:20:31.376943', '_unique_id': '25f4de1ca84a4e118efff75db018d548'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.379 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.380 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.380 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '860f37c4-86de-4b69-b461-e6d863ebf654', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-0000002a-30de244b-c8b3-47e1-99a2-f00752af916f-tap9d52d9d8-41', 'timestamp': '2025-10-07T20:20:31.380818', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'tap9d52d9d8-41', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:43:48:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d52d9d8-41'}, 'message_id': '128b38f6-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.226465457, 'message_signature': '9a80b04ab9092eee58712802bc483fe6eb94b6d836314d1cdccd15e1fdac5e20'}]}, 'timestamp': '2025-10-07 20:20:31.381427', '_unique_id': '606f7696187744f78e4c6e518d983b06'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.382 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.384 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.384 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/disk.device.read.latency volume: 506429486 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.384 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/disk.device.read.latency volume: 133235764 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2a540d26-109b-4dc5-8766-a62dbd417096', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 506429486, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '30de244b-c8b3-47e1-99a2-f00752af916f-vda', 'timestamp': '2025-10-07T20:20:31.384337', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '128bc44c-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.239985987, 'message_signature': '3a2adc99022d3d053bf3f5f7a5b693534061b84364fc275cbdc0b5e5f2fae820'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 133235764, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '30de244b-c8b3-47e1-99a2-f00752af916f-sda', 'timestamp': '2025-10-07T20:20:31.384337', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '128bd9aa-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.239985987, 'message_signature': '8a0632f523992c9d92dd110d0f5562802a73010aa876d4191be3c3c2d87c92f4'}]}, 'timestamp': '2025-10-07 20:20:31.385506', '_unique_id': '500b26ed46c242ebba197da0a16cd920'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.386 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.389 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.389 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/network.incoming.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7d5e5f3c-8d1b-4d6f-ba69-b93d3179735a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-0000002a-30de244b-c8b3-47e1-99a2-f00752af916f-tap9d52d9d8-41', 'timestamp': '2025-10-07T20:20:31.389499', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'tap9d52d9d8-41', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:43:48:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d52d9d8-41'}, 'message_id': '128c8c56-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.226465457, 'message_signature': '2836c23394570cc998a5d2c11b5214f4ccbdc9439c95febd01c75b796201f413'}]}, 'timestamp': '2025-10-07 20:20:31.390115', '_unique_id': 'f85582287d7a4887909fad11945ef274'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.391 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.394 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.394 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/disk.device.read.bytes volume: 30292480 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.395 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '406d90fa-0f11-45da-bcc5-6dd9af674053', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30292480, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '30de244b-c8b3-47e1-99a2-f00752af916f-vda', 'timestamp': '2025-10-07T20:20:31.394592', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '128d56ae-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.239985987, 'message_signature': '92f5d978dd434ee5fe5d96eb561cc7216376fb107e478f6a1643e8129a9ef464'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '30de244b-c8b3-47e1-99a2-f00752af916f-sda', 'timestamp': '2025-10-07T20:20:31.394592', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '128dbcfc-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.239985987, 'message_signature': 'ae52e9de773a6c822c9101797439b2b3ed8d5b49d7717d170ebe70897c01f790'}]}, 'timestamp': '2025-10-07 20:20:31.397989', '_unique_id': '17e3b7876d154b6fa1f3a322a9b6b437'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.401 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.405 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.405 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.405 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1488044028>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1488044028>]
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.406 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.406 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22b6e6ed-3618-46a7-8104-aa74beea5612', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-0000002a-30de244b-c8b3-47e1-99a2-f00752af916f-tap9d52d9d8-41', 'timestamp': '2025-10-07T20:20:31.406688', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'tap9d52d9d8-41', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:43:48:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d52d9d8-41'}, 'message_id': '128f33ac-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.226465457, 'message_signature': '41922854f2d7d10ce7453d6b34d724a335e49c0cba16d4e3b3ca7c247c20ae50'}]}, 'timestamp': '2025-10-07 20:20:31.407598', '_unique_id': '5d52a0c31931477b849a8b081ec9cdbd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.409 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.412 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.412 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df0761a8-ddd8-41cc-bd68-ba052f9aa6c1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-0000002a-30de244b-c8b3-47e1-99a2-f00752af916f-tap9d52d9d8-41', 'timestamp': '2025-10-07T20:20:31.412363', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'tap9d52d9d8-41', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:43:48:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d52d9d8-41'}, 'message_id': '12900cdc-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.226465457, 'message_signature': 'fc3af481c6281dccf2903bc9ac4036a60b435c6f58ae89b8525c1625c163e68e'}]}, 'timestamp': '2025-10-07 20:20:31.413046', '_unique_id': '91cf901d5e934ca3833e685425798d19'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.414 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.416 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.416 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '339fc93a-1b71-42d6-8721-aa3f713c5622', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': 'instance-0000002a-30de244b-c8b3-47e1-99a2-f00752af916f-tap9d52d9d8-41', 'timestamp': '2025-10-07T20:20:31.416899', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'tap9d52d9d8-41', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:43:48:ff', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap9d52d9d8-41'}, 'message_id': '1290bf42-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.226465457, 'message_signature': 'c1d74bcc4f7fba527db2d2c01f225c5a8166c042786a8b7c634e41de40e1abf1'}]}, 'timestamp': '2025-10-07 20:20:31.417619', '_unique_id': 'fe1e4a4e295b4ad0ad0065c3f4cd7feb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.418 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.421 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.421 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/cpu volume: 11110000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '769402e6-271b-426b-9c4e-72496ec25276', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11110000000, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'timestamp': '2025-10-07T20:20:31.421456', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '12916f00-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.303455654, 'message_signature': '98d0ae13feb7361c5b8367fec9e2aa1e13d843ca7351cbcfe6daa338d952aeca'}]}, 'timestamp': '2025-10-07 20:20:31.422080', '_unique_id': '2d1d85c0c8cf45698eadf86f945820cf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.423 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.425 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.426 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.426 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1488044028>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1488044028>]
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.426 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.426 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/disk.device.write.requests volume: 321 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.427 12 DEBUG ceilometer.compute.pollsters [-] 30de244b-c8b3-47e1-99a2-f00752af916f/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9edccdf-884e-4af0-bc67-5e13f5b8d324', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 321, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': None, 'resource_id': '30de244b-c8b3-47e1-99a2-f00752af916f-vda', 'timestamp': '2025-10-07T20:20:31.426873', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '129243e4-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.239985987, 'message_signature': 'c2daa7909afa132dd848c1e499c0b346bd407e20f7acb567346d0e3b77f29a30'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': 'fde8db13cdde4728903e9d2749f853e1', 'user_name': None, 'project_id': '57491b24c6b2419c842483a87c8b4d42', 'project_name': 
None, 'resource_id': '30de244b-c8b3-47e1-99a2-f00752af916f-sda', 'timestamp': '2025-10-07T20:20:31.426873', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1488044028', 'name': 'instance-0000002a', 'instance_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'instance_type': 'm1.nano', 'host': '39e42a4ba822ee12dd3744c0d144dafc09aa2d88ec52fa8e66530692', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '129256fe-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4067.239985987, 'message_signature': '7fd0a056d78e078f8b96f5b5a2236e3255f6b9a17c06868497d2b156d3e79bc9'}]}, 'timestamp': '2025-10-07 20:20:31.427922', '_unique_id': 'fa578b3d35e849a1b59f8e7893a6e9ce'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:20:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:20:31.429 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:20:31 np0005474864 nova_compute[192593]: 2025-10-07 20:20:31.960 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:33 np0005474864 nova_compute[192593]: 2025-10-07 20:20:33.784 2 DEBUG nova.network.neutron [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Updating instance_info_cache with network_info: [{"id": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "address": "fa:16:3e:43:48:ff", "network": {"id": "e65ee0da-6c97-4834-a9da-4a86620baf5d", "bridge": "br-int", "label": "tempest-network-smoke--56963165", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d52d9d8-41", "ovs_interfaceid": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:20:33 np0005474864 nova_compute[192593]: 2025-10-07 20:20:33.813 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Releasing lock "refresh_cache-30de244b-c8b3-47e1-99a2-f00752af916f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:20:33 np0005474864 nova_compute[192593]: 2025-10-07 20:20:33.814 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  7 16:20:33 np0005474864 nova_compute[192593]: 2025-10-07 20:20:33.816 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:20:33 np0005474864 nova_compute[192593]: 2025-10-07 20:20:33.816 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:20:33 np0005474864 nova_compute[192593]: 2025-10-07 20:20:33.816 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:20:33 np0005474864 nova_compute[192593]: 2025-10-07 20:20:33.816 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:20:34 np0005474864 nova_compute[192593]: 2025-10-07 20:20:34.040 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:34 np0005474864 nova_compute[192593]: 2025-10-07 20:20:34.699 2 INFO nova.compute.manager [None req-ec3493df-2ff4-400f-a429-a34e1f4f867d fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Get console output#033[00m
Oct  7 16:20:34 np0005474864 nova_compute[192593]: 2025-10-07 20:20:34.707 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  7 16:20:35 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:35Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:43:48:ff 10.100.0.10
Oct  7 16:20:37 np0005474864 nova_compute[192593]: 2025-10-07 20:20:37.005 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:37 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:37Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:43:48:ff 10.100.0.10
Oct  7 16:20:39 np0005474864 nova_compute[192593]: 2025-10-07 20:20:39.042 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:39 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:39Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:43:48:ff 10.100.0.10
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.374 2 DEBUG oslo_concurrency.lockutils [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "30de244b-c8b3-47e1-99a2-f00752af916f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.375 2 DEBUG oslo_concurrency.lockutils [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "30de244b-c8b3-47e1-99a2-f00752af916f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.375 2 DEBUG oslo_concurrency.lockutils [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "30de244b-c8b3-47e1-99a2-f00752af916f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.376 2 DEBUG oslo_concurrency.lockutils [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "30de244b-c8b3-47e1-99a2-f00752af916f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.376 2 DEBUG oslo_concurrency.lockutils [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "30de244b-c8b3-47e1-99a2-f00752af916f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.378 2 INFO nova.compute.manager [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Terminating instance#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.380 2 DEBUG nova.compute.manager [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  7 16:20:40 np0005474864 podman[227742]: 2025-10-07 20:20:40.381759072 +0000 UTC m=+0.075035111 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:20:40 np0005474864 podman[227743]: 2025-10-07 20:20:40.392525822 +0000 UTC m=+0.077356468 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, vcs-type=git, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', 
'/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, distribution-scope=public, maintainer=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.openshift.expose-services=, version=9.6, io.buildah.version=1.33.7)
Oct  7 16:20:40 np0005474864 kernel: tap9d52d9d8-41 (unregistering): left promiscuous mode
Oct  7 16:20:40 np0005474864 NetworkManager[51631]: <info>  [1759868440.4045] device (tap9d52d9d8-41): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:20:40 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:40Z|00221|binding|INFO|Releasing lport 9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 from this chassis (sb_readonly=0)
Oct  7 16:20:40 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:40Z|00222|binding|INFO|Setting lport 9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 down in Southbound
Oct  7 16:20:40 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:40Z|00223|binding|INFO|Removing iface tap9d52d9d8-41 ovn-installed in OVS
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:40.421 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:43:48:ff 10.100.0.10'], port_security=['fa:16:3e:43:48:ff 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '30de244b-c8b3-47e1-99a2-f00752af916f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e65ee0da-6c97-4834-a9da-4a86620baf5d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '57491b24c6b2419c842483a87c8b4d42', 'neutron:revision_number': '4', 'neutron:security_group_ids': '20f1f1d1-1339-4926-afbc-e9bfffee82bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4e52e3f0-bafc-4319-a52a-27ab7ae25fb4, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=9d52d9d8-4162-4c1a-a1d3-e9539bb3c503) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:20:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:40.425 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 in datapath e65ee0da-6c97-4834-a9da-4a86620baf5d unbound from our chassis#033[00m
Oct  7 16:20:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:40.428 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e65ee0da-6c97-4834-a9da-4a86620baf5d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:20:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:40.430 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c3355cce-b47b-4f64-9fdc-b03d326f44fd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:40.432 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d namespace which is not needed anymore#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.446 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "d50d0791-c234-4390-a519-3ce1c8561824" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.447 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.473 2 DEBUG nova.compute.manager [req-db5f80bd-7b63-402e-9df9-97e6fb7ac31a req-211515be-74c7-46a9-8cb3-0b5e097d67a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Received event network-changed-9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.474 2 DEBUG nova.compute.manager [req-db5f80bd-7b63-402e-9df9-97e6fb7ac31a req-211515be-74c7-46a9-8cb3-0b5e097d67a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Refreshing instance network info cache due to event network-changed-9d52d9d8-4162-4c1a-a1d3-e9539bb3c503. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:20:40 np0005474864 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Oct  7 16:20:40 np0005474864 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000002a.scope: Consumed 13.795s CPU time.
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.474 2 DEBUG oslo_concurrency.lockutils [req-db5f80bd-7b63-402e-9df9-97e6fb7ac31a req-211515be-74c7-46a9-8cb3-0b5e097d67a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-30de244b-c8b3-47e1-99a2-f00752af916f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.474 2 DEBUG oslo_concurrency.lockutils [req-db5f80bd-7b63-402e-9df9-97e6fb7ac31a req-211515be-74c7-46a9-8cb3-0b5e097d67a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-30de244b-c8b3-47e1-99a2-f00752af916f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.475 2 DEBUG nova.network.neutron [req-db5f80bd-7b63-402e-9df9-97e6fb7ac31a req-211515be-74c7-46a9-8cb3-0b5e097d67a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Refreshing network info cache for port 9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:20:40 np0005474864 systemd-machined[152586]: Machine qemu-14-instance-0000002a terminated.
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.481 2 DEBUG nova.compute.manager [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.553 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.554 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.563 2 DEBUG nova.virt.hardware [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.563 2 INFO nova.compute.claims [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  7 16:20:40 np0005474864 neutron-haproxy-ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d[227586]: [NOTICE]   (227636) : haproxy version is 2.8.14-c23fe91
Oct  7 16:20:40 np0005474864 neutron-haproxy-ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d[227586]: [NOTICE]   (227636) : path to executable is /usr/sbin/haproxy
Oct  7 16:20:40 np0005474864 neutron-haproxy-ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d[227586]: [WARNING]  (227636) : Exiting Master process...
Oct  7 16:20:40 np0005474864 neutron-haproxy-ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d[227586]: [ALERT]    (227636) : Current worker (227642) exited with code 143 (Terminated)
Oct  7 16:20:40 np0005474864 neutron-haproxy-ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d[227586]: [WARNING]  (227636) : All workers exited. Exiting... (0)
Oct  7 16:20:40 np0005474864 systemd[1]: libpod-26698238da8ec6c21cb8d8e55b1a1046a4f4cb498928300e6373f1781e51a71e.scope: Deactivated successfully.
Oct  7 16:20:40 np0005474864 podman[227810]: 2025-10-07 20:20:40.622782632 +0000 UTC m=+0.064640172 container died 26698238da8ec6c21cb8d8e55b1a1046a4f4cb498928300e6373f1781e51a71e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.662 2 INFO nova.virt.libvirt.driver [-] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Instance destroyed successfully.#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.663 2 DEBUG nova.objects.instance [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lazy-loading 'resources' on Instance uuid 30de244b-c8b3-47e1-99a2-f00752af916f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:20:40 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-26698238da8ec6c21cb8d8e55b1a1046a4f4cb498928300e6373f1781e51a71e-userdata-shm.mount: Deactivated successfully.
Oct  7 16:20:40 np0005474864 systemd[1]: var-lib-containers-storage-overlay-97fb75e5a9bb2b2a812cae0581a54ae7d4478b9ab6d7ad65eb0260dc712ae429-merged.mount: Deactivated successfully.
Oct  7 16:20:40 np0005474864 podman[227810]: 2025-10-07 20:20:40.682339487 +0000 UTC m=+0.124196997 container cleanup 26698238da8ec6c21cb8d8e55b1a1046a4f4cb498928300e6373f1781e51a71e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.686 2 DEBUG nova.virt.libvirt.vif [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:20:02Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1488044028',display_name='tempest-TestNetworkBasicOps-server-1488044028',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1488044028',id=42,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBD3fP+kEZm1BDnVq1jZ5StbwasJe3y53EdxGsaTK8aISqUdvl2VgCBatFl3aTna8qxy93lplQmDnHOkiqmSZMOoitAgysFHYmhH01/JGskYdF7QWmUbGmk7TM9O9Qc7FbA==',key_name='tempest-TestNetworkBasicOps-509323753',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:20:14Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='57491b24c6b2419c842483a87c8b4d42',ramdisk_id='',reservation_id='r-zf35ijy7',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-666319938',owner_user_name='tempest-TestNetworkBasicOps-666319938-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:20:14Z,user_data=None,user_id='fde8db13cdde4728903e9d2749f853e1',uuid=30de244b-c8b3-47e1-99a2-f00752af916f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "address": "fa:16:3e:43:48:ff", "network": {"id": "e65ee0da-6c97-4834-a9da-4a86620baf5d", "bridge": "br-int", "label": "tempest-network-smoke--56963165", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, 
"ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d52d9d8-41", "ovs_interfaceid": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.687 2 DEBUG nova.network.os_vif_util [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converting VIF {"id": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "address": "fa:16:3e:43:48:ff", "network": {"id": "e65ee0da-6c97-4834-a9da-4a86620baf5d", "bridge": "br-int", "label": "tempest-network-smoke--56963165", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "1.2.3.4", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.249", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d52d9d8-41", "ovs_interfaceid": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.688 2 DEBUG nova.network.os_vif_util [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:43:48:ff,bridge_name='br-int',has_traffic_filtering=True,id=9d52d9d8-4162-4c1a-a1d3-e9539bb3c503,network=Network(e65ee0da-6c97-4834-a9da-4a86620baf5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d52d9d8-41') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.688 2 DEBUG os_vif [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:43:48:ff,bridge_name='br-int',has_traffic_filtering=True,id=9d52d9d8-4162-4c1a-a1d3-e9539bb3c503,network=Network(e65ee0da-6c97-4834-a9da-4a86620baf5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d52d9d8-41') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:20:40 np0005474864 systemd[1]: libpod-conmon-26698238da8ec6c21cb8d8e55b1a1046a4f4cb498928300e6373f1781e51a71e.scope: Deactivated successfully.
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.691 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.692 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap9d52d9d8-41, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.698 2 DEBUG nova.compute.provider_tree [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.712 2 DEBUG nova.scheduler.client.report [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.725 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.729 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.731 2 INFO os_vif [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:43:48:ff,bridge_name='br-int',has_traffic_filtering=True,id=9d52d9d8-4162-4c1a-a1d3-e9539bb3c503,network=Network(e65ee0da-6c97-4834-a9da-4a86620baf5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap9d52d9d8-41')#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.732 2 INFO nova.virt.libvirt.driver [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Deleting instance files /var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f_del#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.732 2 INFO nova.virt.libvirt.driver [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Deletion of /var/lib/nova/instances/30de244b-c8b3-47e1-99a2-f00752af916f_del complete#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.739 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.185s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.741 2 DEBUG nova.compute.manager [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  7 16:20:40 np0005474864 podman[227853]: 2025-10-07 20:20:40.775079057 +0000 UTC m=+0.065526677 container remove 26698238da8ec6c21cb8d8e55b1a1046a4f4cb498928300e6373f1781e51a71e (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:20:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:40.781 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[daae83c9-c7aa-4653-a61e-580b10464d6d]: (4, ('Tue Oct  7 08:20:40 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d (26698238da8ec6c21cb8d8e55b1a1046a4f4cb498928300e6373f1781e51a71e)\n26698238da8ec6c21cb8d8e55b1a1046a4f4cb498928300e6373f1781e51a71e\nTue Oct  7 08:20:40 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d (26698238da8ec6c21cb8d8e55b1a1046a4f4cb498928300e6373f1781e51a71e)\n26698238da8ec6c21cb8d8e55b1a1046a4f4cb498928300e6373f1781e51a71e\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:40.786 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d8aa4d79-64d5-4f15-9a25-61cdfc2bdb30]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:40.787 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape65ee0da-60, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.791 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:40 np0005474864 kernel: tape65ee0da-60: left promiscuous mode
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:40.808 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[4aa2e795-3037-4fa0-b7d5-49ac143f1458]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.815 2 DEBUG nova.compute.manager [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.816 2 DEBUG nova.network.neutron [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.828 2 INFO nova.compute.manager [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Took 0.45 seconds to destroy the instance on the hypervisor.#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.829 2 DEBUG oslo.service.loopingcall [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.831 2 DEBUG nova.compute.manager [-] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.831 2 DEBUG nova.network.neutron [-] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.842 2 INFO nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  7 16:20:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:40.845 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ad0a3449-14d8-4587-9525-3aa29a997b28]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:40.847 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[e0dbb8c1-916f-4b7a-8934-c53500a552f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.867 2 DEBUG nova.compute.manager [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  7 16:20:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:40.876 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[62420d54-4e70-4fda-8ad8-4188c8482810]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 404861, 'reachable_time': 18324, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 227867, 'error': None, 'target': 'ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:40 np0005474864 systemd[1]: run-netns-ovnmeta\x2de65ee0da\x2d6c97\x2d4834\x2da9da\x2d4a86620baf5d.mount: Deactivated successfully.
Oct  7 16:20:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:40.880 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e65ee0da-6c97-4834-a9da-4a86620baf5d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:20:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:40.880 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[d39f9fa3-553e-41b2-8383-d5b42dfabfa6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.975 2 DEBUG nova.compute.manager [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.976 2 DEBUG nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.977 2 INFO nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Creating image(s)#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.977 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "/var/lib/nova/instances/d50d0791-c234-4390-a519-3ce1c8561824/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.978 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "/var/lib/nova/instances/d50d0791-c234-4390-a519-3ce1c8561824/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.978 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "/var/lib/nova/instances/d50d0791-c234-4390-a519-3ce1c8561824/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:40 np0005474864 nova_compute[192593]: 2025-10-07 20:20:40.994 2 DEBUG oslo_concurrency.processutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.063 2 DEBUG oslo_concurrency.processutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.064 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.065 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.088 2 DEBUG oslo_concurrency.processutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.157 2 DEBUG oslo_concurrency.processutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.159 2 DEBUG oslo_concurrency.processutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/d50d0791-c234-4390-a519-3ce1c8561824/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.197 2 DEBUG oslo_concurrency.processutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/d50d0791-c234-4390-a519-3ce1c8561824/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.198 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.199 2 DEBUG oslo_concurrency.processutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.261 2 DEBUG oslo_concurrency.processutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.263 2 DEBUG nova.virt.disk.api [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Checking if we can resize image /var/lib/nova/instances/d50d0791-c234-4390-a519-3ce1c8561824/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.264 2 DEBUG oslo_concurrency.processutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d50d0791-c234-4390-a519-3ce1c8561824/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.329 2 DEBUG oslo_concurrency.processutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d50d0791-c234-4390-a519-3ce1c8561824/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.331 2 DEBUG nova.virt.disk.api [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Cannot resize image /var/lib/nova/instances/d50d0791-c234-4390-a519-3ce1c8561824/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.332 2 DEBUG nova.objects.instance [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'migration_context' on Instance uuid d50d0791-c234-4390-a519-3ce1c8561824 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.346 2 DEBUG nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.347 2 DEBUG nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Ensure instance console log exists: /var/lib/nova/instances/d50d0791-c234-4390-a519-3ce1c8561824/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.347 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.348 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.348 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:41 np0005474864 nova_compute[192593]: 2025-10-07 20:20:41.775 2 DEBUG nova.policy [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.410 2 DEBUG nova.network.neutron [-] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.430 2 INFO nova.compute.manager [-] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Took 1.60 seconds to deallocate network for instance.#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.487 2 DEBUG oslo_concurrency.lockutils [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.487 2 DEBUG oslo_concurrency.lockutils [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.604 2 DEBUG nova.compute.provider_tree [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.621 2 DEBUG nova.scheduler.client.report [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.642 2 DEBUG oslo_concurrency.lockutils [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.671 2 INFO nova.scheduler.client.report [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Deleted allocations for instance 30de244b-c8b3-47e1-99a2-f00752af916f#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.734 2 DEBUG oslo_concurrency.lockutils [None req-130b4b43-01dd-4325-94d4-7ab7b34a66b0 fde8db13cdde4728903e9d2749f853e1 57491b24c6b2419c842483a87c8b4d42 - - default default] Lock "30de244b-c8b3-47e1-99a2-f00752af916f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.360s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.817 2 DEBUG nova.network.neutron [req-db5f80bd-7b63-402e-9df9-97e6fb7ac31a req-211515be-74c7-46a9-8cb3-0b5e097d67a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Updated VIF entry in instance network info cache for port 9d52d9d8-4162-4c1a-a1d3-e9539bb3c503. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.818 2 DEBUG nova.network.neutron [req-db5f80bd-7b63-402e-9df9-97e6fb7ac31a req-211515be-74c7-46a9-8cb3-0b5e097d67a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Updating instance_info_cache with network_info: [{"id": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "address": "fa:16:3e:43:48:ff", "network": {"id": "e65ee0da-6c97-4834-a9da-4a86620baf5d", "bridge": "br-int", "label": "tempest-network-smoke--56963165", "subnets": [{"cidr": "10.100.0.0/28", "dns": [{"address": "9.8.7.6", "type": "dns", "version": 4, "meta": {}}], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "57491b24c6b2419c842483a87c8b4d42", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap9d52d9d8-41", "ovs_interfaceid": "9d52d9d8-4162-4c1a-a1d3-e9539bb3c503", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.837 2 DEBUG oslo_concurrency.lockutils [req-db5f80bd-7b63-402e-9df9-97e6fb7ac31a req-211515be-74c7-46a9-8cb3-0b5e097d67a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-30de244b-c8b3-47e1-99a2-f00752af916f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.874 2 DEBUG nova.compute.manager [req-9c063420-25fd-4b3a-a4ff-8fbeaa163fa6 req-0978bdc3-7ecc-4b35-8c52-3c7d7c100c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Received event network-vif-unplugged-9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.874 2 DEBUG oslo_concurrency.lockutils [req-9c063420-25fd-4b3a-a4ff-8fbeaa163fa6 req-0978bdc3-7ecc-4b35-8c52-3c7d7c100c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "30de244b-c8b3-47e1-99a2-f00752af916f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.874 2 DEBUG oslo_concurrency.lockutils [req-9c063420-25fd-4b3a-a4ff-8fbeaa163fa6 req-0978bdc3-7ecc-4b35-8c52-3c7d7c100c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "30de244b-c8b3-47e1-99a2-f00752af916f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.875 2 DEBUG oslo_concurrency.lockutils [req-9c063420-25fd-4b3a-a4ff-8fbeaa163fa6 req-0978bdc3-7ecc-4b35-8c52-3c7d7c100c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "30de244b-c8b3-47e1-99a2-f00752af916f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.875 2 DEBUG nova.compute.manager [req-9c063420-25fd-4b3a-a4ff-8fbeaa163fa6 req-0978bdc3-7ecc-4b35-8c52-3c7d7c100c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] No waiting events found dispatching network-vif-unplugged-9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.875 2 WARNING nova.compute.manager [req-9c063420-25fd-4b3a-a4ff-8fbeaa163fa6 req-0978bdc3-7ecc-4b35-8c52-3c7d7c100c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Received unexpected event network-vif-unplugged-9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 for instance with vm_state deleted and task_state None.#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.875 2 DEBUG nova.compute.manager [req-9c063420-25fd-4b3a-a4ff-8fbeaa163fa6 req-0978bdc3-7ecc-4b35-8c52-3c7d7c100c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Received event network-vif-plugged-9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.875 2 DEBUG oslo_concurrency.lockutils [req-9c063420-25fd-4b3a-a4ff-8fbeaa163fa6 req-0978bdc3-7ecc-4b35-8c52-3c7d7c100c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "30de244b-c8b3-47e1-99a2-f00752af916f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.876 2 DEBUG oslo_concurrency.lockutils [req-9c063420-25fd-4b3a-a4ff-8fbeaa163fa6 req-0978bdc3-7ecc-4b35-8c52-3c7d7c100c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "30de244b-c8b3-47e1-99a2-f00752af916f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.876 2 DEBUG oslo_concurrency.lockutils [req-9c063420-25fd-4b3a-a4ff-8fbeaa163fa6 req-0978bdc3-7ecc-4b35-8c52-3c7d7c100c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "30de244b-c8b3-47e1-99a2-f00752af916f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.876 2 DEBUG nova.compute.manager [req-9c063420-25fd-4b3a-a4ff-8fbeaa163fa6 req-0978bdc3-7ecc-4b35-8c52-3c7d7c100c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] No waiting events found dispatching network-vif-plugged-9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.876 2 WARNING nova.compute.manager [req-9c063420-25fd-4b3a-a4ff-8fbeaa163fa6 req-0978bdc3-7ecc-4b35-8c52-3c7d7c100c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Received unexpected event network-vif-plugged-9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 for instance with vm_state deleted and task_state None.#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.877 2 DEBUG nova.compute.manager [req-9c063420-25fd-4b3a-a4ff-8fbeaa163fa6 req-0978bdc3-7ecc-4b35-8c52-3c7d7c100c1f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Received event network-vif-deleted-9d52d9d8-4162-4c1a-a1d3-e9539bb3c503 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:20:42 np0005474864 nova_compute[192593]: 2025-10-07 20:20:42.898 2 DEBUG nova.network.neutron [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Successfully created port: 47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  7 16:20:43 np0005474864 nova_compute[192593]: 2025-10-07 20:20:43.513 2 DEBUG nova.network.neutron [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Successfully created port: 81d8ec64-ed9f-4338-b27a-8151379ca57b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  7 16:20:44 np0005474864 nova_compute[192593]: 2025-10-07 20:20:44.046 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:44 np0005474864 podman[227883]: 2025-10-07 20:20:44.37209599 +0000 UTC m=+0.061460841 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:20:44 np0005474864 podman[227885]: 2025-10-07 20:20:44.373967023 +0000 UTC m=+0.061572354 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd)
Oct  7 16:20:44 np0005474864 podman[227884]: 2025-10-07 20:20:44.427091323 +0000 UTC m=+0.115016593 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Oct  7 16:20:45 np0005474864 nova_compute[192593]: 2025-10-07 20:20:45.720 2 DEBUG nova.network.neutron [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Successfully updated port: 47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:20:45 np0005474864 nova_compute[192593]: 2025-10-07 20:20:45.726 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:45 np0005474864 nova_compute[192593]: 2025-10-07 20:20:45.822 2 DEBUG nova.compute.manager [req-0f5d3b52-0aaa-417f-857b-6aa1786c3807 req-8dfd5fa1-5e62-4efc-8d2a-75b6b441d005 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Received event network-changed-47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:20:45 np0005474864 nova_compute[192593]: 2025-10-07 20:20:45.823 2 DEBUG nova.compute.manager [req-0f5d3b52-0aaa-417f-857b-6aa1786c3807 req-8dfd5fa1-5e62-4efc-8d2a-75b6b441d005 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Refreshing instance network info cache due to event network-changed-47b2b5e3-3d12-4a83-9b8f-7229649e4bc6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:20:45 np0005474864 nova_compute[192593]: 2025-10-07 20:20:45.824 2 DEBUG oslo_concurrency.lockutils [req-0f5d3b52-0aaa-417f-857b-6aa1786c3807 req-8dfd5fa1-5e62-4efc-8d2a-75b6b441d005 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-d50d0791-c234-4390-a519-3ce1c8561824" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:20:45 np0005474864 nova_compute[192593]: 2025-10-07 20:20:45.824 2 DEBUG oslo_concurrency.lockutils [req-0f5d3b52-0aaa-417f-857b-6aa1786c3807 req-8dfd5fa1-5e62-4efc-8d2a-75b6b441d005 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-d50d0791-c234-4390-a519-3ce1c8561824" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:20:45 np0005474864 nova_compute[192593]: 2025-10-07 20:20:45.824 2 DEBUG nova.network.neutron [req-0f5d3b52-0aaa-417f-857b-6aa1786c3807 req-8dfd5fa1-5e62-4efc-8d2a-75b6b441d005 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Refreshing network info cache for port 47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:20:46 np0005474864 nova_compute[192593]: 2025-10-07 20:20:46.002 2 DEBUG nova.network.neutron [req-0f5d3b52-0aaa-417f-857b-6aa1786c3807 req-8dfd5fa1-5e62-4efc-8d2a-75b6b441d005 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:20:46 np0005474864 nova_compute[192593]: 2025-10-07 20:20:46.454 2 DEBUG nova.network.neutron [req-0f5d3b52-0aaa-417f-857b-6aa1786c3807 req-8dfd5fa1-5e62-4efc-8d2a-75b6b441d005 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:20:46 np0005474864 nova_compute[192593]: 2025-10-07 20:20:46.487 2 DEBUG oslo_concurrency.lockutils [req-0f5d3b52-0aaa-417f-857b-6aa1786c3807 req-8dfd5fa1-5e62-4efc-8d2a-75b6b441d005 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-d50d0791-c234-4390-a519-3ce1c8561824" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:20:46 np0005474864 nova_compute[192593]: 2025-10-07 20:20:46.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:46 np0005474864 nova_compute[192593]: 2025-10-07 20:20:46.716 2 DEBUG nova.network.neutron [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Successfully updated port: 81d8ec64-ed9f-4338-b27a-8151379ca57b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:20:46 np0005474864 nova_compute[192593]: 2025-10-07 20:20:46.732 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "refresh_cache-d50d0791-c234-4390-a519-3ce1c8561824" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:20:46 np0005474864 nova_compute[192593]: 2025-10-07 20:20:46.733 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquired lock "refresh_cache-d50d0791-c234-4390-a519-3ce1c8561824" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:20:46 np0005474864 nova_compute[192593]: 2025-10-07 20:20:46.733 2 DEBUG nova.network.neutron [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:20:46 np0005474864 nova_compute[192593]: 2025-10-07 20:20:46.837 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:46 np0005474864 nova_compute[192593]: 2025-10-07 20:20:46.917 2 DEBUG nova.network.neutron [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:20:47 np0005474864 nova_compute[192593]: 2025-10-07 20:20:47.924 2 DEBUG nova.compute.manager [req-66cd340e-b335-45a3-a948-a3eb6cfdd761 req-8a39e4b6-f56c-4a5b-a529-1d06525fe92c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Received event network-changed-81d8ec64-ed9f-4338-b27a-8151379ca57b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:20:47 np0005474864 nova_compute[192593]: 2025-10-07 20:20:47.925 2 DEBUG nova.compute.manager [req-66cd340e-b335-45a3-a948-a3eb6cfdd761 req-8a39e4b6-f56c-4a5b-a529-1d06525fe92c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Refreshing instance network info cache due to event network-changed-81d8ec64-ed9f-4338-b27a-8151379ca57b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:20:47 np0005474864 nova_compute[192593]: 2025-10-07 20:20:47.925 2 DEBUG oslo_concurrency.lockutils [req-66cd340e-b335-45a3-a948-a3eb6cfdd761 req-8a39e4b6-f56c-4a5b-a529-1d06525fe92c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-d50d0791-c234-4390-a519-3ce1c8561824" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.047 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.100 2 DEBUG nova.network.neutron [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Updating instance_info_cache with network_info: [{"id": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "address": "fa:16:3e:3b:b3:fc", "network": {"id": "df8550c5-2041-43a8-b4e7-93cfe1bef135", "bridge": "br-int", "label": "tempest-network-smoke--522925216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b2b5e3-3d", "ovs_interfaceid": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "address": "fa:16:3e:ab:65:f1", "network": {"id": "59555e18-9ca2-4493-b3a1-3bf0b720e9a1", "bridge": "br-int", "label": "tempest-network-smoke--755268223", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feab:65f1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81d8ec64-ed", "ovs_interfaceid": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.136 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Releasing lock "refresh_cache-d50d0791-c234-4390-a519-3ce1c8561824" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.136 2 DEBUG nova.compute.manager [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Instance network_info: |[{"id": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "address": "fa:16:3e:3b:b3:fc", "network": {"id": "df8550c5-2041-43a8-b4e7-93cfe1bef135", "bridge": "br-int", "label": "tempest-network-smoke--522925216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b2b5e3-3d", "ovs_interfaceid": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "address": "fa:16:3e:ab:65:f1", "network": {"id": "59555e18-9ca2-4493-b3a1-3bf0b720e9a1", "bridge": "br-int", "label": "tempest-network-smoke--755268223", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feab:65f1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81d8ec64-ed", "ovs_interfaceid": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.137 2 DEBUG oslo_concurrency.lockutils [req-66cd340e-b335-45a3-a948-a3eb6cfdd761 req-8a39e4b6-f56c-4a5b-a529-1d06525fe92c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-d50d0791-c234-4390-a519-3ce1c8561824" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.137 2 DEBUG nova.network.neutron [req-66cd340e-b335-45a3-a948-a3eb6cfdd761 req-8a39e4b6-f56c-4a5b-a529-1d06525fe92c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Refreshing network info cache for port 81d8ec64-ed9f-4338-b27a-8151379ca57b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.143 2 DEBUG nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Start _get_guest_xml network_info=[{"id": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "address": "fa:16:3e:3b:b3:fc", "network": {"id": "df8550c5-2041-43a8-b4e7-93cfe1bef135", "bridge": "br-int", "label": "tempest-network-smoke--522925216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b2b5e3-3d", "ovs_interfaceid": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "address": "fa:16:3e:ab:65:f1", "network": {"id": "59555e18-9ca2-4493-b3a1-3bf0b720e9a1", "bridge": "br-int", "label": "tempest-network-smoke--755268223", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feab:65f1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81d8ec64-ed", "ovs_interfaceid": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.153 2 WARNING nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.159 2 DEBUG nova.virt.libvirt.host [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.160 2 DEBUG nova.virt.libvirt.host [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.166 2 DEBUG nova.virt.libvirt.host [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.166 2 DEBUG nova.virt.libvirt.host [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.168 2 DEBUG nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.168 2 DEBUG nova.virt.hardware [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T20:09:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3fec056a-1226-48ad-a02c-e4fe097a9363',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.169 2 DEBUG nova.virt.hardware [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.170 2 DEBUG nova.virt.hardware [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.170 2 DEBUG nova.virt.hardware [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.170 2 DEBUG nova.virt.hardware [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.171 2 DEBUG nova.virt.hardware [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.171 2 DEBUG nova.virt.hardware [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.171 2 DEBUG nova.virt.hardware [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.172 2 DEBUG nova.virt.hardware [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.172 2 DEBUG nova.virt.hardware [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.172 2 DEBUG nova.virt.hardware [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.177 2 DEBUG nova.virt.libvirt.vif [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:20:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-391741114',display_name='tempest-TestGettingAddress-server-391741114',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-391741114',id=43,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLepS7UZnZEM3zK2oLlFRRLXWOfEHmJt8uAk2zB865zu3amuSShBszwPYXlzJxSQhwgmArQdnWmVkU9wtlkXYFa4p8oAgakCfaAyEYsfUvVjD4w+Nydk19S66h6w3PLg6w==',key_name='tempest-TestGettingAddress-1398019678',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-7u9w3wu0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:20:40Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=d50d0791-c234-4390-a519-3ce1c8561824,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "address": "fa:16:3e:3b:b3:fc", "network": {"id": "df8550c5-2041-43a8-b4e7-93cfe1bef135", "bridge": "br-int", "label": "tempest-network-smoke--522925216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b2b5e3-3d", "ovs_interfaceid": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.177 2 DEBUG nova.network.os_vif_util [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "address": "fa:16:3e:3b:b3:fc", "network": {"id": "df8550c5-2041-43a8-b4e7-93cfe1bef135", "bridge": "br-int", "label": "tempest-network-smoke--522925216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b2b5e3-3d", "ovs_interfaceid": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.179 2 DEBUG nova.network.os_vif_util [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:b3:fc,bridge_name='br-int',has_traffic_filtering=True,id=47b2b5e3-3d12-4a83-9b8f-7229649e4bc6,network=Network(df8550c5-2041-43a8-b4e7-93cfe1bef135),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47b2b5e3-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.180 2 DEBUG nova.virt.libvirt.vif [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:20:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-391741114',display_name='tempest-TestGettingAddress-server-391741114',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-391741114',id=43,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLepS7UZnZEM3zK2oLlFRRLXWOfEHmJt8uAk2zB865zu3amuSShBszwPYXlzJxSQhwgmArQdnWmVkU9wtlkXYFa4p8oAgakCfaAyEYsfUvVjD4w+Nydk19S66h6w3PLg6w==',key_name='tempest-TestGettingAddress-1398019678',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-7u9w3wu0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:20:40Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=d50d0791-c234-4390-a519-3ce1c8561824,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "address": "fa:16:3e:ab:65:f1", "network": {"id": "59555e18-9ca2-4493-b3a1-3bf0b720e9a1", "bridge": "br-int", "label": "tempest-network-smoke--755268223", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feab:65f1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81d8ec64-ed", "ovs_interfaceid": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.180 2 DEBUG nova.network.os_vif_util [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "address": "fa:16:3e:ab:65:f1", "network": {"id": "59555e18-9ca2-4493-b3a1-3bf0b720e9a1", "bridge": "br-int", "label": "tempest-network-smoke--755268223", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feab:65f1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81d8ec64-ed", "ovs_interfaceid": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.181 2 DEBUG nova.network.os_vif_util [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:65:f1,bridge_name='br-int',has_traffic_filtering=True,id=81d8ec64-ed9f-4338-b27a-8151379ca57b,network=Network(59555e18-9ca2-4493-b3a1-3bf0b720e9a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81d8ec64-ed') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.182 2 DEBUG nova.objects.instance [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'pci_devices' on Instance uuid d50d0791-c234-4390-a519-3ce1c8561824 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.196 2 DEBUG nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] End _get_guest_xml xml=<domain type="kvm">
Oct  7 16:20:49 np0005474864 nova_compute[192593]:  <uuid>d50d0791-c234-4390-a519-3ce1c8561824</uuid>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:  <name>instance-0000002b</name>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:  <memory>131072</memory>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:  <vcpu>1</vcpu>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <nova:name>tempest-TestGettingAddress-server-391741114</nova:name>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <nova:creationTime>2025-10-07 20:20:49</nova:creationTime>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <nova:flavor name="m1.nano">
Oct  7 16:20:49 np0005474864 nova_compute[192593]:        <nova:memory>128</nova:memory>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:        <nova:disk>1</nova:disk>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:        <nova:swap>0</nova:swap>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:        <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:        <nova:vcpus>1</nova:vcpus>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      </nova:flavor>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <nova:owner>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:        <nova:user uuid="334f092941fc46c496c7def76b2cfe18">tempest-TestGettingAddress-626136673-project-member</nova:user>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:        <nova:project uuid="2f9bf744045540618c9980fd4a7694f5">tempest-TestGettingAddress-626136673</nova:project>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      </nova:owner>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <nova:ports>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:        <nova:port uuid="47b2b5e3-3d12-4a83-9b8f-7229649e4bc6">
Oct  7 16:20:49 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:        <nova:port uuid="81d8ec64-ed9f-4338-b27a-8151379ca57b">
Oct  7 16:20:49 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:feab:65f1" ipVersion="6"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      </nova:ports>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    </nova:instance>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:  <sysinfo type="smbios">
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <entry name="manufacturer">RDO</entry>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <entry name="product">OpenStack Compute</entry>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <entry name="serial">d50d0791-c234-4390-a519-3ce1c8561824</entry>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <entry name="uuid">d50d0791-c234-4390-a519-3ce1c8561824</entry>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <entry name="family">Virtual Machine</entry>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <boot dev="hd"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <smbios mode="sysinfo"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <vmcoreinfo/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:  <clock offset="utc">
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <timer name="pit" tickpolicy="delay"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <timer name="hpet" present="no"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:  <cpu mode="custom" match="exact">
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <model>Nehalem</model>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <topology sockets="1" cores="1" threads="1"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <disk type="file" device="disk">
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/d50d0791-c234-4390-a519-3ce1c8561824/disk"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <target dev="vda" bus="virtio"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <disk type="file" device="cdrom">
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <driver name="qemu" type="raw" cache="none"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/d50d0791-c234-4390-a519-3ce1c8561824/disk.config"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <target dev="sda" bus="sata"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:3b:b3:fc"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <target dev="tap47b2b5e3-3d"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:ab:65:f1"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <target dev="tap81d8ec64-ed"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <serial type="pty">
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <log file="/var/lib/nova/instances/d50d0791-c234-4390-a519-3ce1c8561824/console.log" append="off"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <input type="tablet" bus="usb"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <rng model="virtio">
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <backend model="random">/dev/urandom</backend>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <controller type="usb" index="0"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    <memballoon model="virtio">
Oct  7 16:20:49 np0005474864 nova_compute[192593]:      <stats period="10"/>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:20:49 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:20:49 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:20:49 np0005474864 nova_compute[192593]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.197 2 DEBUG nova.compute.manager [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Preparing to wait for external event network-vif-plugged-47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.198 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "d50d0791-c234-4390-a519-3ce1c8561824-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.199 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.199 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.199 2 DEBUG nova.compute.manager [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Preparing to wait for external event network-vif-plugged-81d8ec64-ed9f-4338-b27a-8151379ca57b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.199 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "d50d0791-c234-4390-a519-3ce1c8561824-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.199 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.199 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.201 2 DEBUG nova.virt.libvirt.vif [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:20:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-391741114',display_name='tempest-TestGettingAddress-server-391741114',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-391741114',id=43,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLepS7UZnZEM3zK2oLlFRRLXWOfEHmJt8uAk2zB865zu3amuSShBszwPYXlzJxSQhwgmArQdnWmVkU9wtlkXYFa4p8oAgakCfaAyEYsfUvVjD4w+Nydk19S66h6w3PLg6w==',key_name='tempest-TestGettingAddress-1398019678',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-7u9w3wu0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:20:40Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=d50d0791-c234-4390-a519-3ce1c8561824,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "address": "fa:16:3e:3b:b3:fc", "network": {"id": "df8550c5-2041-43a8-b4e7-93cfe1bef135", "bridge": "br-int", "label": "tempest-network-smoke--522925216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b2b5e3-3d", "ovs_interfaceid": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.201 2 DEBUG nova.network.os_vif_util [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "address": "fa:16:3e:3b:b3:fc", "network": {"id": "df8550c5-2041-43a8-b4e7-93cfe1bef135", "bridge": "br-int", "label": "tempest-network-smoke--522925216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b2b5e3-3d", "ovs_interfaceid": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.202 2 DEBUG nova.network.os_vif_util [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:3b:b3:fc,bridge_name='br-int',has_traffic_filtering=True,id=47b2b5e3-3d12-4a83-9b8f-7229649e4bc6,network=Network(df8550c5-2041-43a8-b4e7-93cfe1bef135),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47b2b5e3-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.203 2 DEBUG os_vif [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:b3:fc,bridge_name='br-int',has_traffic_filtering=True,id=47b2b5e3-3d12-4a83-9b8f-7229649e4bc6,network=Network(df8550c5-2041-43a8-b4e7-93cfe1bef135),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47b2b5e3-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.204 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.204 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.205 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.208 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.208 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap47b2b5e3-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.208 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap47b2b5e3-3d, col_values=(('external_ids', {'iface-id': '47b2b5e3-3d12-4a83-9b8f-7229649e4bc6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:3b:b3:fc', 'vm-uuid': 'd50d0791-c234-4390-a519-3ce1c8561824'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:49 np0005474864 NetworkManager[51631]: <info>  [1759868449.2122] manager: (tap47b2b5e3-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/117)
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.218 2 INFO os_vif [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:3b:b3:fc,bridge_name='br-int',has_traffic_filtering=True,id=47b2b5e3-3d12-4a83-9b8f-7229649e4bc6,network=Network(df8550c5-2041-43a8-b4e7-93cfe1bef135),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47b2b5e3-3d')#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.219 2 DEBUG nova.virt.libvirt.vif [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:20:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-391741114',display_name='tempest-TestGettingAddress-server-391741114',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-391741114',id=43,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLepS7UZnZEM3zK2oLlFRRLXWOfEHmJt8uAk2zB865zu3amuSShBszwPYXlzJxSQhwgmArQdnWmVkU9wtlkXYFa4p8oAgakCfaAyEYsfUvVjD4w+Nydk19S66h6w3PLg6w==',key_name='tempest-TestGettingAddress-1398019678',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-7u9w3wu0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:20:40Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=d50d0791-c234-4390-a519-3ce1c8561824,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "address": "fa:16:3e:ab:65:f1", "network": {"id": "59555e18-9ca2-4493-b3a1-3bf0b720e9a1", "bridge": "br-int", "label": "tempest-network-smoke--755268223", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feab:65f1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81d8ec64-ed", "ovs_interfaceid": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.220 2 DEBUG nova.network.os_vif_util [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "address": "fa:16:3e:ab:65:f1", "network": {"id": "59555e18-9ca2-4493-b3a1-3bf0b720e9a1", "bridge": "br-int", "label": "tempest-network-smoke--755268223", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feab:65f1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81d8ec64-ed", "ovs_interfaceid": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.220 2 DEBUG nova.network.os_vif_util [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:65:f1,bridge_name='br-int',has_traffic_filtering=True,id=81d8ec64-ed9f-4338-b27a-8151379ca57b,network=Network(59555e18-9ca2-4493-b3a1-3bf0b720e9a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81d8ec64-ed') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.220 2 DEBUG os_vif [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:65:f1,bridge_name='br-int',has_traffic_filtering=True,id=81d8ec64-ed9f-4338-b27a-8151379ca57b,network=Network(59555e18-9ca2-4493-b3a1-3bf0b720e9a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81d8ec64-ed') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.221 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.221 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.224 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap81d8ec64-ed, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.224 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap81d8ec64-ed, col_values=(('external_ids', {'iface-id': '81d8ec64-ed9f-4338-b27a-8151379ca57b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ab:65:f1', 'vm-uuid': 'd50d0791-c234-4390-a519-3ce1c8561824'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:49 np0005474864 NetworkManager[51631]: <info>  [1759868449.2259] manager: (tap81d8ec64-ed): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/118)
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.235 2 INFO os_vif [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:65:f1,bridge_name='br-int',has_traffic_filtering=True,id=81d8ec64-ed9f-4338-b27a-8151379ca57b,network=Network(59555e18-9ca2-4493-b3a1-3bf0b720e9a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81d8ec64-ed')#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.297 2 DEBUG nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.297 2 DEBUG nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.297 2 DEBUG nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No VIF found with MAC fa:16:3e:3b:b3:fc, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.298 2 DEBUG nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No VIF found with MAC fa:16:3e:ab:65:f1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:20:49 np0005474864 nova_compute[192593]: 2025-10-07 20:20:49.298 2 INFO nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Using config drive#033[00m
Oct  7 16:20:50 np0005474864 nova_compute[192593]: 2025-10-07 20:20:50.375 2 INFO nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Creating config drive at /var/lib/nova/instances/d50d0791-c234-4390-a519-3ce1c8561824/disk.config#033[00m
Oct  7 16:20:50 np0005474864 nova_compute[192593]: 2025-10-07 20:20:50.380 2 DEBUG oslo_concurrency.processutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d50d0791-c234-4390-a519-3ce1c8561824/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptwuchufr execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:20:50 np0005474864 nova_compute[192593]: 2025-10-07 20:20:50.526 2 DEBUG oslo_concurrency.processutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d50d0791-c234-4390-a519-3ce1c8561824/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmptwuchufr" returned: 0 in 0.146s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:20:50 np0005474864 NetworkManager[51631]: <info>  [1759868450.6131] manager: (tap47b2b5e3-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/119)
Oct  7 16:20:50 np0005474864 kernel: tap47b2b5e3-3d: entered promiscuous mode
Oct  7 16:20:50 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:50Z|00224|binding|INFO|Claiming lport 47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 for this chassis.
Oct  7 16:20:50 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:50Z|00225|binding|INFO|47b2b5e3-3d12-4a83-9b8f-7229649e4bc6: Claiming fa:16:3e:3b:b3:fc 10.100.0.4
Oct  7 16:20:50 np0005474864 nova_compute[192593]: 2025-10-07 20:20:50.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:50 np0005474864 nova_compute[192593]: 2025-10-07 20:20:50.633 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:50 np0005474864 NetworkManager[51631]: <info>  [1759868450.6402] manager: (tap81d8ec64-ed): new Tun device (/org/freedesktop/NetworkManager/Devices/120)
Oct  7 16:20:50 np0005474864 kernel: tap81d8ec64-ed: entered promiscuous mode
Oct  7 16:20:50 np0005474864 NetworkManager[51631]: <info>  [1759868450.6442] manager: (patch-br-int-to-provnet-53a6f422-cce6-4082-98aa-3619989346bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/121)
Oct  7 16:20:50 np0005474864 nova_compute[192593]: 2025-10-07 20:20:50.643 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:50 np0005474864 NetworkManager[51631]: <info>  [1759868450.6454] manager: (patch-provnet-53a6f422-cce6-4082-98aa-3619989346bc-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.648 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:b3:fc 10.100.0.4'], port_security=['fa:16:3e:3b:b3:fc 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'd50d0791-c234-4390-a519-3ce1c8561824', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df8550c5-2041-43a8-b4e7-93cfe1bef135', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cccb65b5-c6db-4579-9026-34d6963f156f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f97db387-40f7-48d5-82ac-67eefd868507, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=47b2b5e3-3d12-4a83-9b8f-7229649e4bc6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.649 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 in datapath df8550c5-2041-43a8-b4e7-93cfe1bef135 bound to our chassis#033[00m
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.651 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network df8550c5-2041-43a8-b4e7-93cfe1bef135#033[00m
Oct  7 16:20:50 np0005474864 systemd-udevd[227973]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:20:50 np0005474864 systemd-udevd[227974]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.667 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[02519688-b046-4c3a-a73f-93fcc454e5c2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.668 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdf8550c5-21 in ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.672 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdf8550c5-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.673 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[fa65f37b-a4e5-47a0-8702-e5647658550b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.674 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c617db60-bc80-4fd7-b97d-122f99635292]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:50 np0005474864 NetworkManager[51631]: <info>  [1759868450.6759] device (tap47b2b5e3-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:20:50 np0005474864 NetworkManager[51631]: <info>  [1759868450.6778] device (tap47b2b5e3-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:20:50 np0005474864 NetworkManager[51631]: <info>  [1759868450.6792] device (tap81d8ec64-ed): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:20:50 np0005474864 NetworkManager[51631]: <info>  [1759868450.6806] device (tap81d8ec64-ed): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.687 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[6d08ade3-725c-45f8-8ee4-7d35b5341392]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:50 np0005474864 systemd-machined[152586]: New machine qemu-15-instance-0000002b.
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.719 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[a162663b-404e-4927-bc69-ea8894731a03]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.760 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[0ad770d1-8d65-4613-9dbb-9ccd7ee321ff]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:50 np0005474864 systemd[1]: Started Virtual Machine qemu-15-instance-0000002b.
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.780 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[2f21c224-9592-4ef1-8c3b-e934282ff0dd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:50 np0005474864 NetworkManager[51631]: <info>  [1759868450.7831] manager: (tapdf8550c5-20): new Veth device (/org/freedesktop/NetworkManager/Devices/123)
Oct  7 16:20:50 np0005474864 systemd-udevd[227980]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:20:50 np0005474864 nova_compute[192593]: 2025-10-07 20:20:50.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:50 np0005474864 nova_compute[192593]: 2025-10-07 20:20:50.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:50 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:50Z|00226|binding|INFO|Claiming lport 81d8ec64-ed9f-4338-b27a-8151379ca57b for this chassis.
Oct  7 16:20:50 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:50Z|00227|binding|INFO|81d8ec64-ed9f-4338-b27a-8151379ca57b: Claiming fa:16:3e:ab:65:f1 2001:db8::f816:3eff:feab:65f1
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.825 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[e61fc951-9c0b-4350-ace6-453d4eb8491f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:50 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:50Z|00228|binding|INFO|Setting lport 47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 ovn-installed in OVS
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.829 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[27a68e6d-e711-4e81-be68-b39793c80b66]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:50 np0005474864 nova_compute[192593]: 2025-10-07 20:20:50.832 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:50 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:50Z|00229|binding|INFO|Setting lport 47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 up in Southbound
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.832 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:65:f1 2001:db8::f816:3eff:feab:65f1'], port_security=['fa:16:3e:ab:65:f1 2001:db8::f816:3eff:feab:65f1'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feab:65f1/64', 'neutron:device_id': 'd50d0791-c234-4390-a519-3ce1c8561824', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59555e18-9ca2-4493-b3a1-3bf0b720e9a1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'cccb65b5-c6db-4579-9026-34d6963f156f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=880882ab-db67-4f3b-b111-de501a51e81b, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=81d8ec64-ed9f-4338-b27a-8151379ca57b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:20:50 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:50Z|00230|binding|INFO|Setting lport 81d8ec64-ed9f-4338-b27a-8151379ca57b ovn-installed in OVS
Oct  7 16:20:50 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:50Z|00231|binding|INFO|Setting lport 81d8ec64-ed9f-4338-b27a-8151379ca57b up in Southbound
Oct  7 16:20:50 np0005474864 nova_compute[192593]: 2025-10-07 20:20:50.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:50 np0005474864 NetworkManager[51631]: <info>  [1759868450.8587] device (tapdf8550c5-20): carrier: link connected
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.865 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[b7d07756-b3d2-4528-abf1-7e5c999bacab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.889 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[33bd8f60-dd10-4764-a772-1173ea967400]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf8550c5-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:b7:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408675, 'reachable_time': 39361, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228009, 'error': None, 'target': 'ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.912 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c20be5ae-ed7b-430b-b0d7-70b91888d87e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe29:b713'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408675, 'tstamp': 408675}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228011, 'error': None, 'target': 'ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.941 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[8301d6a9-abd4-44b6-bd19-2ed68d439d4b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdf8550c5-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:29:b7:13'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 69], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408675, 'reachable_time': 39361, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228012, 'error': None, 'target': 'ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:50 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:50.984 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[44fda135-7310-44bc-aac7-cf11ea4f5304]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.087 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[b42e7a9d-085a-45f2-80e9-4ed31e8e8887]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.089 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf8550c5-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.089 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.089 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdf8550c5-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:51 np0005474864 kernel: tapdf8550c5-20: entered promiscuous mode
Oct  7 16:20:51 np0005474864 NetworkManager[51631]: <info>  [1759868451.0925] manager: (tapdf8550c5-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.094 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdf8550c5-20, col_values=(('external_ids', {'iface-id': '2231334b-6af3-4157-82a6-d9f48cff359c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:51 np0005474864 nova_compute[192593]: 2025-10-07 20:20:51.094 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:51 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:51Z|00232|binding|INFO|Releasing lport 2231334b-6af3-4157-82a6-d9f48cff359c from this chassis (sb_readonly=0)
Oct  7 16:20:51 np0005474864 nova_compute[192593]: 2025-10-07 20:20:51.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.113 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/df8550c5-2041-43a8-b4e7-93cfe1bef135.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/df8550c5-2041-43a8-b4e7-93cfe1bef135.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.114 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[25d9132a-e681-40aa-9113-959ae502c891]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.115 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-df8550c5-2041-43a8-b4e7-93cfe1bef135
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/df8550c5-2041-43a8-b4e7-93cfe1bef135.pid.haproxy
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID df8550c5-2041-43a8-b4e7-93cfe1bef135
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.117 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135', 'env', 'PROCESS_TAG=haproxy-df8550c5-2041-43a8-b4e7-93cfe1bef135', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/df8550c5-2041-43a8-b4e7-93cfe1bef135.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:20:51 np0005474864 nova_compute[192593]: 2025-10-07 20:20:51.305 2 DEBUG nova.compute.manager [req-2ff58337-3ea2-497f-98e3-a7fd390fa25b req-2581bfa8-97c1-42c2-bbda-664df4724f3c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Received event network-vif-plugged-81d8ec64-ed9f-4338-b27a-8151379ca57b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:20:51 np0005474864 nova_compute[192593]: 2025-10-07 20:20:51.305 2 DEBUG oslo_concurrency.lockutils [req-2ff58337-3ea2-497f-98e3-a7fd390fa25b req-2581bfa8-97c1-42c2-bbda-664df4724f3c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d50d0791-c234-4390-a519-3ce1c8561824-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:51 np0005474864 nova_compute[192593]: 2025-10-07 20:20:51.305 2 DEBUG oslo_concurrency.lockutils [req-2ff58337-3ea2-497f-98e3-a7fd390fa25b req-2581bfa8-97c1-42c2-bbda-664df4724f3c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:51 np0005474864 nova_compute[192593]: 2025-10-07 20:20:51.306 2 DEBUG oslo_concurrency.lockutils [req-2ff58337-3ea2-497f-98e3-a7fd390fa25b req-2581bfa8-97c1-42c2-bbda-664df4724f3c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:51 np0005474864 nova_compute[192593]: 2025-10-07 20:20:51.306 2 DEBUG nova.compute.manager [req-2ff58337-3ea2-497f-98e3-a7fd390fa25b req-2581bfa8-97c1-42c2-bbda-664df4724f3c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Processing event network-vif-plugged-81d8ec64-ed9f-4338-b27a-8151379ca57b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:20:51 np0005474864 nova_compute[192593]: 2025-10-07 20:20:51.477 2 DEBUG nova.network.neutron [req-66cd340e-b335-45a3-a948-a3eb6cfdd761 req-8a39e4b6-f56c-4a5b-a529-1d06525fe92c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Updated VIF entry in instance network info cache for port 81d8ec64-ed9f-4338-b27a-8151379ca57b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:20:51 np0005474864 nova_compute[192593]: 2025-10-07 20:20:51.478 2 DEBUG nova.network.neutron [req-66cd340e-b335-45a3-a948-a3eb6cfdd761 req-8a39e4b6-f56c-4a5b-a529-1d06525fe92c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Updating instance_info_cache with network_info: [{"id": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "address": "fa:16:3e:3b:b3:fc", "network": {"id": "df8550c5-2041-43a8-b4e7-93cfe1bef135", "bridge": "br-int", "label": "tempest-network-smoke--522925216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b2b5e3-3d", "ovs_interfaceid": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "address": "fa:16:3e:ab:65:f1", "network": {"id": "59555e18-9ca2-4493-b3a1-3bf0b720e9a1", "bridge": "br-int", "label": "tempest-network-smoke--755268223", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feab:65f1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81d8ec64-ed", "ovs_interfaceid": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:20:51 np0005474864 nova_compute[192593]: 2025-10-07 20:20:51.497 2 DEBUG oslo_concurrency.lockutils [req-66cd340e-b335-45a3-a948-a3eb6cfdd761 req-8a39e4b6-f56c-4a5b-a529-1d06525fe92c 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-d50d0791-c234-4390-a519-3ce1c8561824" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:20:51 np0005474864 podman[228051]: 2025-10-07 20:20:51.564556068 +0000 UTC m=+0.071139380 container create 2a13d920dc8e5fe7bbe17eda8e456b64ca354b7ab638064ca05001d54ad6fd03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  7 16:20:51 np0005474864 systemd[1]: Started libpod-conmon-2a13d920dc8e5fe7bbe17eda8e456b64ca354b7ab638064ca05001d54ad6fd03.scope.
Oct  7 16:20:51 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:20:51 np0005474864 podman[228051]: 2025-10-07 20:20:51.533528454 +0000 UTC m=+0.040111796 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:20:51 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c1a3ce0bdb43d163a58f1eaa307aa1bbe5f7066862c3af7b10efac0e2a9d7ee9/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:20:51 np0005474864 nova_compute[192593]: 2025-10-07 20:20:51.638 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868451.6380327, d50d0791-c234-4390-a519-3ce1c8561824 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:20:51 np0005474864 nova_compute[192593]: 2025-10-07 20:20:51.639 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d50d0791-c234-4390-a519-3ce1c8561824] VM Started (Lifecycle Event)#033[00m
Oct  7 16:20:51 np0005474864 podman[228051]: 2025-10-07 20:20:51.645480138 +0000 UTC m=+0.152063470 container init 2a13d920dc8e5fe7bbe17eda8e456b64ca354b7ab638064ca05001d54ad6fd03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, io.buildah.version=1.41.3)
Oct  7 16:20:51 np0005474864 podman[228051]: 2025-10-07 20:20:51.651489881 +0000 UTC m=+0.158073193 container start 2a13d920dc8e5fe7bbe17eda8e456b64ca354b7ab638064ca05001d54ad6fd03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3)
Oct  7 16:20:51 np0005474864 podman[228064]: 2025-10-07 20:20:51.662788066 +0000 UTC m=+0.058942678 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  7 16:20:51 np0005474864 nova_compute[192593]: 2025-10-07 20:20:51.673 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:20:51 np0005474864 neutron-haproxy-ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135[228067]: [NOTICE]   (228086) : New worker (228089) forked
Oct  7 16:20:51 np0005474864 neutron-haproxy-ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135[228067]: [NOTICE]   (228086) : Loading success.
Oct  7 16:20:51 np0005474864 nova_compute[192593]: 2025-10-07 20:20:51.678 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868451.6386638, d50d0791-c234-4390-a519-3ce1c8561824 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:20:51 np0005474864 nova_compute[192593]: 2025-10-07 20:20:51.678 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d50d0791-c234-4390-a519-3ce1c8561824] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:20:51 np0005474864 nova_compute[192593]: 2025-10-07 20:20:51.701 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:20:51 np0005474864 nova_compute[192593]: 2025-10-07 20:20:51.705 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.713 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 81d8ec64-ed9f-4338-b27a-8151379ca57b in datapath 59555e18-9ca2-4493-b3a1-3bf0b720e9a1 unbound from our chassis#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.715 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 59555e18-9ca2-4493-b3a1-3bf0b720e9a1#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.729 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[b47eef8b-496b-4f1e-9947-83b70a2293f1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.731 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap59555e18-91 in ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.734 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap59555e18-90 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.734 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[644d816b-165e-48fb-8218-1dbc36b7867e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.735 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[cb96f491-1816-4374-8e81-058e41250bfd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:51 np0005474864 nova_compute[192593]: 2025-10-07 20:20:51.742 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d50d0791-c234-4390-a519-3ce1c8561824] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.748 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[2ed415e8-b21c-4766-997b-98e3a181a8e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.767 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[4b9169f1-6ab0-4e4e-84db-6a5e6adcb6cc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.804 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[21664ec4-d75a-4dcb-b1b1-15c5d8cc0b21]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.811 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[39f106b3-88e0-4089-a94f-a90850882afd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:51 np0005474864 NetworkManager[51631]: <info>  [1759868451.8129] manager: (tap59555e18-90): new Veth device (/org/freedesktop/NetworkManager/Devices/125)
Oct  7 16:20:51 np0005474864 systemd-udevd[227989]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.854 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[843f1222-b299-429f-a5da-32a09280ecb0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.858 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[8d8af357-56ba-42a6-b2f3-cd08da1ba0e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:51 np0005474864 NetworkManager[51631]: <info>  [1759868451.8878] device (tap59555e18-90): carrier: link connected
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.896 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[a7518b83-e2e7-4819-8c34-158282c068d5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.928 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[617c7d31-6609-47ad-9581-ad2aeb813432]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap59555e18-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:aa:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408778, 'reachable_time': 26658, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228108, 'error': None, 'target': 'ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.951 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[fa3e3949-ee73-41ed-9593-c5de64b4b8dd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe40:aa77'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 408778, 'tstamp': 408778}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228109, 'error': None, 'target': 'ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:51.978 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c56d5e65-b5c4-4a35-859c-17291be76636]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap59555e18-91'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:40:aa:77'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408778, 'reachable_time': 26658, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228110, 'error': None, 'target': 'ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:52.018 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[a5fee7b9-2a8b-492f-8f60-e649582e1c93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:52.064 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[b2a10ac5-7d43-41fd-a8c7-0cf98d2e1768]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:52.066 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap59555e18-90, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:52.067 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:52.068 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap59555e18-90, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.070 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:52 np0005474864 NetworkManager[51631]: <info>  [1759868452.0713] manager: (tap59555e18-90): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/126)
Oct  7 16:20:52 np0005474864 kernel: tap59555e18-90: entered promiscuous mode
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.073 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:52.075 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap59555e18-90, col_values=(('external_ids', {'iface-id': '0a0b847e-5cc7-4076-b854-9b22eb923cee'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.076 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:52 np0005474864 ovn_controller[94801]: 2025-10-07T20:20:52Z|00233|binding|INFO|Releasing lport 0a0b847e-5cc7-4076-b854-9b22eb923cee from this chassis (sb_readonly=0)
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.103 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:52.104 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/59555e18-9ca2-4493-b3a1-3bf0b720e9a1.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/59555e18-9ca2-4493-b3a1-3bf0b720e9a1.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:52.105 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[de75bf4e-bd58-4c4e-9666-4009da091013]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:52.106 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-59555e18-9ca2-4493-b3a1-3bf0b720e9a1
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/59555e18-9ca2-4493-b3a1-3bf0b720e9a1.pid.haproxy
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID 59555e18-9ca2-4493-b3a1-3bf0b720e9a1
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:20:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:52.107 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1', 'env', 'PROCESS_TAG=haproxy-59555e18-9ca2-4493-b3a1-3bf0b720e9a1', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/59555e18-9ca2-4493-b3a1-3bf0b720e9a1.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.432 2 DEBUG nova.compute.manager [req-7e214abe-74bf-4fd1-85f0-270053c8dc9e req-bf55af16-11b4-49a7-8c57-8d0d4df616f8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Received event network-vif-plugged-47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.432 2 DEBUG oslo_concurrency.lockutils [req-7e214abe-74bf-4fd1-85f0-270053c8dc9e req-bf55af16-11b4-49a7-8c57-8d0d4df616f8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d50d0791-c234-4390-a519-3ce1c8561824-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.433 2 DEBUG oslo_concurrency.lockutils [req-7e214abe-74bf-4fd1-85f0-270053c8dc9e req-bf55af16-11b4-49a7-8c57-8d0d4df616f8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.433 2 DEBUG oslo_concurrency.lockutils [req-7e214abe-74bf-4fd1-85f0-270053c8dc9e req-bf55af16-11b4-49a7-8c57-8d0d4df616f8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.433 2 DEBUG nova.compute.manager [req-7e214abe-74bf-4fd1-85f0-270053c8dc9e req-bf55af16-11b4-49a7-8c57-8d0d4df616f8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Processing event network-vif-plugged-47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.434 2 DEBUG nova.compute.manager [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.439 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868452.4395325, d50d0791-c234-4390-a519-3ce1c8561824 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.440 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d50d0791-c234-4390-a519-3ce1c8561824] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.442 2 DEBUG nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.446 2 INFO nova.virt.libvirt.driver [-] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Instance spawned successfully.#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.446 2 DEBUG nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.477 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.484 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.487 2 DEBUG nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.488 2 DEBUG nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.488 2 DEBUG nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.489 2 DEBUG nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.489 2 DEBUG nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.490 2 DEBUG nova.virt.libvirt.driver [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.521 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: d50d0791-c234-4390-a519-3ce1c8561824] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.559 2 INFO nova.compute.manager [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Took 11.58 seconds to spawn the instance on the hypervisor.#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.560 2 DEBUG nova.compute.manager [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:20:52 np0005474864 podman[228140]: 2025-10-07 20:20:52.517358743 +0000 UTC m=+0.038422948 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:20:52 np0005474864 podman[228140]: 2025-10-07 20:20:52.627181085 +0000 UTC m=+0.148245280 container create 1ba4af3e683a03fd38f51d2d66ca7775c91e060b577419e91135e926cb8fae40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.633 2 INFO nova.compute.manager [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Took 12.11 seconds to build instance.#033[00m
Oct  7 16:20:52 np0005474864 nova_compute[192593]: 2025-10-07 20:20:52.649 2 DEBUG oslo_concurrency.lockutils [None req-0745e0e5-bf87-4bcb-a1bc-a546eb83fdea 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.202s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:52 np0005474864 systemd[1]: Started libpod-conmon-1ba4af3e683a03fd38f51d2d66ca7775c91e060b577419e91135e926cb8fae40.scope.
Oct  7 16:20:52 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:20:52 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c7842f16d5ad7182cdeca92c89ff85cc393f5babe7d97a485df76e8a0dffc13/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:20:52 np0005474864 podman[228140]: 2025-10-07 20:20:52.724768415 +0000 UTC m=+0.245832700 container init 1ba4af3e683a03fd38f51d2d66ca7775c91e060b577419e91135e926cb8fae40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.vendor=CentOS)
Oct  7 16:20:52 np0005474864 podman[228140]: 2025-10-07 20:20:52.736628646 +0000 UTC m=+0.257692881 container start 1ba4af3e683a03fd38f51d2d66ca7775c91e060b577419e91135e926cb8fae40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Oct  7 16:20:52 np0005474864 neutron-haproxy-ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1[228155]: [NOTICE]   (228159) : New worker (228161) forked
Oct  7 16:20:52 np0005474864 neutron-haproxy-ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1[228155]: [NOTICE]   (228159) : Loading success.
Oct  7 16:20:53 np0005474864 nova_compute[192593]: 2025-10-07 20:20:53.419 2 DEBUG nova.compute.manager [req-d19a7f66-144d-409b-b0d2-4f9fe9e328b6 req-aa5fb180-fcc9-4b34-b1ea-06ee8e33603e 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Received event network-vif-plugged-81d8ec64-ed9f-4338-b27a-8151379ca57b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:20:53 np0005474864 nova_compute[192593]: 2025-10-07 20:20:53.420 2 DEBUG oslo_concurrency.lockutils [req-d19a7f66-144d-409b-b0d2-4f9fe9e328b6 req-aa5fb180-fcc9-4b34-b1ea-06ee8e33603e 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d50d0791-c234-4390-a519-3ce1c8561824-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:53 np0005474864 nova_compute[192593]: 2025-10-07 20:20:53.420 2 DEBUG oslo_concurrency.lockutils [req-d19a7f66-144d-409b-b0d2-4f9fe9e328b6 req-aa5fb180-fcc9-4b34-b1ea-06ee8e33603e 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:53 np0005474864 nova_compute[192593]: 2025-10-07 20:20:53.420 2 DEBUG oslo_concurrency.lockutils [req-d19a7f66-144d-409b-b0d2-4f9fe9e328b6 req-aa5fb180-fcc9-4b34-b1ea-06ee8e33603e 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:53 np0005474864 nova_compute[192593]: 2025-10-07 20:20:53.421 2 DEBUG nova.compute.manager [req-d19a7f66-144d-409b-b0d2-4f9fe9e328b6 req-aa5fb180-fcc9-4b34-b1ea-06ee8e33603e 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] No waiting events found dispatching network-vif-plugged-81d8ec64-ed9f-4338-b27a-8151379ca57b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:20:53 np0005474864 nova_compute[192593]: 2025-10-07 20:20:53.421 2 WARNING nova.compute.manager [req-d19a7f66-144d-409b-b0d2-4f9fe9e328b6 req-aa5fb180-fcc9-4b34-b1ea-06ee8e33603e 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Received unexpected event network-vif-plugged-81d8ec64-ed9f-4338-b27a-8151379ca57b for instance with vm_state active and task_state None.#033[00m
Oct  7 16:20:54 np0005474864 nova_compute[192593]: 2025-10-07 20:20:54.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:54 np0005474864 nova_compute[192593]: 2025-10-07 20:20:54.226 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:54 np0005474864 podman[228170]: 2025-10-07 20:20:54.370058978 +0000 UTC m=+0.070371647 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:20:55 np0005474864 nova_compute[192593]: 2025-10-07 20:20:55.042 2 DEBUG nova.compute.manager [req-9a5f79d0-022d-4b9b-af73-6db64726d016 req-6e831386-35fd-47b4-959b-45afbc461fe5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Received event network-vif-plugged-47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:20:55 np0005474864 nova_compute[192593]: 2025-10-07 20:20:55.042 2 DEBUG oslo_concurrency.lockutils [req-9a5f79d0-022d-4b9b-af73-6db64726d016 req-6e831386-35fd-47b4-959b-45afbc461fe5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d50d0791-c234-4390-a519-3ce1c8561824-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:20:55 np0005474864 nova_compute[192593]: 2025-10-07 20:20:55.043 2 DEBUG oslo_concurrency.lockutils [req-9a5f79d0-022d-4b9b-af73-6db64726d016 req-6e831386-35fd-47b4-959b-45afbc461fe5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:20:55 np0005474864 nova_compute[192593]: 2025-10-07 20:20:55.043 2 DEBUG oslo_concurrency.lockutils [req-9a5f79d0-022d-4b9b-af73-6db64726d016 req-6e831386-35fd-47b4-959b-45afbc461fe5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:20:55 np0005474864 nova_compute[192593]: 2025-10-07 20:20:55.043 2 DEBUG nova.compute.manager [req-9a5f79d0-022d-4b9b-af73-6db64726d016 req-6e831386-35fd-47b4-959b-45afbc461fe5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] No waiting events found dispatching network-vif-plugged-47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:20:55 np0005474864 nova_compute[192593]: 2025-10-07 20:20:55.043 2 WARNING nova.compute.manager [req-9a5f79d0-022d-4b9b-af73-6db64726d016 req-6e831386-35fd-47b4-959b-45afbc461fe5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Received unexpected event network-vif-plugged-47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 for instance with vm_state active and task_state None.#033[00m
Oct  7 16:20:55 np0005474864 nova_compute[192593]: 2025-10-07 20:20:55.660 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759868440.6600366, 30de244b-c8b3-47e1-99a2-f00752af916f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:20:55 np0005474864 nova_compute[192593]: 2025-10-07 20:20:55.661 2 INFO nova.compute.manager [-] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] VM Stopped (Lifecycle Event)#033[00m
Oct  7 16:20:55 np0005474864 nova_compute[192593]: 2025-10-07 20:20:55.715 2 DEBUG nova.compute.manager [None req-2220ee1a-a960-4b35-b878-46ab148aa817 - - - - - -] [instance: 30de244b-c8b3-47e1-99a2-f00752af916f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:20:56 np0005474864 nova_compute[192593]: 2025-10-07 20:20:56.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:58 np0005474864 podman[228195]: 2025-10-07 20:20:58.402461537 +0000 UTC m=+0.098754084 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:20:59 np0005474864 nova_compute[192593]: 2025-10-07 20:20:59.074 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:59 np0005474864 nova_compute[192593]: 2025-10-07 20:20:59.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:20:59 np0005474864 nova_compute[192593]: 2025-10-07 20:20:59.300 2 DEBUG nova.compute.manager [req-e0682345-cf80-46ce-babf-94383468fd7f req-7e72f73f-b296-4785-97b6-c1af4cf072c3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Received event network-changed-47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:20:59 np0005474864 nova_compute[192593]: 2025-10-07 20:20:59.300 2 DEBUG nova.compute.manager [req-e0682345-cf80-46ce-babf-94383468fd7f req-7e72f73f-b296-4785-97b6-c1af4cf072c3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Refreshing instance network info cache due to event network-changed-47b2b5e3-3d12-4a83-9b8f-7229649e4bc6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:20:59 np0005474864 nova_compute[192593]: 2025-10-07 20:20:59.301 2 DEBUG oslo_concurrency.lockutils [req-e0682345-cf80-46ce-babf-94383468fd7f req-7e72f73f-b296-4785-97b6-c1af4cf072c3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-d50d0791-c234-4390-a519-3ce1c8561824" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:20:59 np0005474864 nova_compute[192593]: 2025-10-07 20:20:59.301 2 DEBUG oslo_concurrency.lockutils [req-e0682345-cf80-46ce-babf-94383468fd7f req-7e72f73f-b296-4785-97b6-c1af4cf072c3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-d50d0791-c234-4390-a519-3ce1c8561824" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:20:59 np0005474864 nova_compute[192593]: 2025-10-07 20:20:59.301 2 DEBUG nova.network.neutron [req-e0682345-cf80-46ce-babf-94383468fd7f req-7e72f73f-b296-4785-97b6-c1af4cf072c3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Refreshing network info cache for port 47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:20:59 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:59.752 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:76:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'ba:bb:91:e8:2b:5d'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:20:59 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:20:59.753 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 9 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  7 16:20:59 np0005474864 nova_compute[192593]: 2025-10-07 20:20:59.788 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:01Z|00234|binding|INFO|Releasing lport 0a0b847e-5cc7-4076-b854-9b22eb923cee from this chassis (sb_readonly=0)
Oct  7 16:21:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:01Z|00235|binding|INFO|Releasing lport 2231334b-6af3-4157-82a6-d9f48cff359c from this chassis (sb_readonly=0)
Oct  7 16:21:01 np0005474864 nova_compute[192593]: 2025-10-07 20:21:01.230 2 DEBUG nova.network.neutron [req-e0682345-cf80-46ce-babf-94383468fd7f req-7e72f73f-b296-4785-97b6-c1af4cf072c3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Updated VIF entry in instance network info cache for port 47b2b5e3-3d12-4a83-9b8f-7229649e4bc6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:21:01 np0005474864 nova_compute[192593]: 2025-10-07 20:21:01.232 2 DEBUG nova.network.neutron [req-e0682345-cf80-46ce-babf-94383468fd7f req-7e72f73f-b296-4785-97b6-c1af4cf072c3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Updating instance_info_cache with network_info: [{"id": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "address": "fa:16:3e:3b:b3:fc", "network": {"id": "df8550c5-2041-43a8-b4e7-93cfe1bef135", "bridge": "br-int", "label": "tempest-network-smoke--522925216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b2b5e3-3d", "ovs_interfaceid": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "address": "fa:16:3e:ab:65:f1", "network": {"id": "59555e18-9ca2-4493-b3a1-3bf0b720e9a1", "bridge": "br-int", "label": "tempest-network-smoke--755268223", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feab:65f1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81d8ec64-ed", "ovs_interfaceid": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:21:01 np0005474864 nova_compute[192593]: 2025-10-07 20:21:01.257 2 DEBUG oslo_concurrency.lockutils [req-e0682345-cf80-46ce-babf-94383468fd7f req-7e72f73f-b296-4785-97b6-c1af4cf072c3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-d50d0791-c234-4390-a519-3ce1c8561824" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:21:01 np0005474864 nova_compute[192593]: 2025-10-07 20:21:01.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:04 np0005474864 nova_compute[192593]: 2025-10-07 20:21:04.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:04 np0005474864 nova_compute[192593]: 2025-10-07 20:21:04.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:06 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:06Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:3b:b3:fc 10.100.0.4
Oct  7 16:21:06 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:06Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:3b:b3:fc 10.100.0.4
Oct  7 16:21:07 np0005474864 nova_compute[192593]: 2025-10-07 20:21:07.714 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:08 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:08.755 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:21:09 np0005474864 nova_compute[192593]: 2025-10-07 20:21:09.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:09 np0005474864 nova_compute[192593]: 2025-10-07 20:21:09.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:11 np0005474864 podman[228232]: 2025-10-07 20:21:11.383720545 +0000 UTC m=+0.070800379 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  7 16:21:11 np0005474864 podman[228233]: 2025-10-07 20:21:11.394452744 +0000 UTC m=+0.070998045 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, distribution-scope=public, version=9.6, io.openshift.expose-services=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., release=1755695350, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  7 16:21:14 np0005474864 nova_compute[192593]: 2025-10-07 20:21:14.082 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:14 np0005474864 nova_compute[192593]: 2025-10-07 20:21:14.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:15 np0005474864 podman[228278]: 2025-10-07 20:21:15.373155626 +0000 UTC m=+0.073807226 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, managed_by=edpm_ansible, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:21:15 np0005474864 podman[228280]: 2025-10-07 20:21:15.386164991 +0000 UTC m=+0.066559188 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001)
Oct  7 16:21:15 np0005474864 podman[228279]: 2025-10-07 20:21:15.413505698 +0000 UTC m=+0.105832698 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  7 16:21:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:16.195 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:16.196 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:16.197 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:18 np0005474864 nova_compute[192593]: 2025-10-07 20:21:18.837 2 DEBUG oslo_concurrency.lockutils [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "d50d0791-c234-4390-a519-3ce1c8561824" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:18 np0005474864 nova_compute[192593]: 2025-10-07 20:21:18.837 2 DEBUG oslo_concurrency.lockutils [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:18 np0005474864 nova_compute[192593]: 2025-10-07 20:21:18.838 2 DEBUG oslo_concurrency.lockutils [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "d50d0791-c234-4390-a519-3ce1c8561824-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:18 np0005474864 nova_compute[192593]: 2025-10-07 20:21:18.838 2 DEBUG oslo_concurrency.lockutils [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:18 np0005474864 nova_compute[192593]: 2025-10-07 20:21:18.839 2 DEBUG oslo_concurrency.lockutils [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:18 np0005474864 nova_compute[192593]: 2025-10-07 20:21:18.841 2 INFO nova.compute.manager [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Terminating instance#033[00m
Oct  7 16:21:18 np0005474864 nova_compute[192593]: 2025-10-07 20:21:18.843 2 DEBUG nova.compute.manager [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  7 16:21:18 np0005474864 kernel: tap47b2b5e3-3d (unregistering): left promiscuous mode
Oct  7 16:21:18 np0005474864 NetworkManager[51631]: <info>  [1759868478.8900] device (tap47b2b5e3-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:21:18 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:18Z|00236|binding|INFO|Releasing lport 47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 from this chassis (sb_readonly=0)
Oct  7 16:21:18 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:18Z|00237|binding|INFO|Setting lport 47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 down in Southbound
Oct  7 16:21:18 np0005474864 nova_compute[192593]: 2025-10-07 20:21:18.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:18 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:18Z|00238|binding|INFO|Removing iface tap47b2b5e3-3d ovn-installed in OVS
Oct  7 16:21:18 np0005474864 nova_compute[192593]: 2025-10-07 20:21:18.907 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:18 np0005474864 nova_compute[192593]: 2025-10-07 20:21:18.919 2 DEBUG nova.compute.manager [req-8d3b3e90-2188-4c42-a793-f760590896ed req-d9b8b303-2fad-487d-acd3-c90d2b48315a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Received event network-changed-47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:21:18 np0005474864 nova_compute[192593]: 2025-10-07 20:21:18.919 2 DEBUG nova.compute.manager [req-8d3b3e90-2188-4c42-a793-f760590896ed req-d9b8b303-2fad-487d-acd3-c90d2b48315a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Refreshing instance network info cache due to event network-changed-47b2b5e3-3d12-4a83-9b8f-7229649e4bc6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:21:18 np0005474864 nova_compute[192593]: 2025-10-07 20:21:18.920 2 DEBUG oslo_concurrency.lockutils [req-8d3b3e90-2188-4c42-a793-f760590896ed req-d9b8b303-2fad-487d-acd3-c90d2b48315a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-d50d0791-c234-4390-a519-3ce1c8561824" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:21:18 np0005474864 nova_compute[192593]: 2025-10-07 20:21:18.920 2 DEBUG oslo_concurrency.lockutils [req-8d3b3e90-2188-4c42-a793-f760590896ed req-d9b8b303-2fad-487d-acd3-c90d2b48315a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-d50d0791-c234-4390-a519-3ce1c8561824" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:21:18 np0005474864 nova_compute[192593]: 2025-10-07 20:21:18.920 2 DEBUG nova.network.neutron [req-8d3b3e90-2188-4c42-a793-f760590896ed req-d9b8b303-2fad-487d-acd3-c90d2b48315a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Refreshing network info cache for port 47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:21:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:18.920 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3b:b3:fc 10.100.0.4'], port_security=['fa:16:3e:3b:b3:fc 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': 'd50d0791-c234-4390-a519-3ce1c8561824', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-df8550c5-2041-43a8-b4e7-93cfe1bef135', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cccb65b5-c6db-4579-9026-34d6963f156f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=f97db387-40f7-48d5-82ac-67eefd868507, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=47b2b5e3-3d12-4a83-9b8f-7229649e4bc6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:21:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:18.922 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 in datapath df8550c5-2041-43a8-b4e7-93cfe1bef135 unbound from our chassis#033[00m
Oct  7 16:21:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:18.925 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network df8550c5-2041-43a8-b4e7-93cfe1bef135, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:21:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:18.926 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[2fc99e18-d30f-4976-b2e0-bbefab459098]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:18.927 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135 namespace which is not needed anymore#033[00m
Oct  7 16:21:18 np0005474864 kernel: tap81d8ec64-ed (unregistering): left promiscuous mode
Oct  7 16:21:18 np0005474864 nova_compute[192593]: 2025-10-07 20:21:18.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:18 np0005474864 NetworkManager[51631]: <info>  [1759868478.9440] device (tap81d8ec64-ed): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:21:18 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:18Z|00239|binding|INFO|Releasing lport 81d8ec64-ed9f-4338-b27a-8151379ca57b from this chassis (sb_readonly=0)
Oct  7 16:21:18 np0005474864 nova_compute[192593]: 2025-10-07 20:21:18.976 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:18 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:18Z|00240|binding|INFO|Setting lport 81d8ec64-ed9f-4338-b27a-8151379ca57b down in Southbound
Oct  7 16:21:18 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:18Z|00241|binding|INFO|Removing iface tap81d8ec64-ed ovn-installed in OVS
Oct  7 16:21:18 np0005474864 nova_compute[192593]: 2025-10-07 20:21:18.982 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:18 np0005474864 nova_compute[192593]: 2025-10-07 20:21:18.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.012 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:65:f1 2001:db8::f816:3eff:feab:65f1'], port_security=['fa:16:3e:ab:65:f1 2001:db8::f816:3eff:feab:65f1'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:feab:65f1/64', 'neutron:device_id': 'd50d0791-c234-4390-a519-3ce1c8561824', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-59555e18-9ca2-4493-b3a1-3bf0b720e9a1', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'cccb65b5-c6db-4579-9026-34d6963f156f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=880882ab-db67-4f3b-b111-de501a51e81b, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=81d8ec64-ed9f-4338-b27a-8151379ca57b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:21:19 np0005474864 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000002b.scope: Deactivated successfully.
Oct  7 16:21:19 np0005474864 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000002b.scope: Consumed 14.061s CPU time.
Oct  7 16:21:19 np0005474864 systemd-machined[152586]: Machine qemu-15-instance-0000002b terminated.
Oct  7 16:21:19 np0005474864 NetworkManager[51631]: <info>  [1759868479.0703] manager: (tap47b2b5e3-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/127)
Oct  7 16:21:19 np0005474864 NetworkManager[51631]: <info>  [1759868479.0862] manager: (tap81d8ec64-ed): new Tun device (/org/freedesktop/NetworkManager/Devices/128)
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:19 np0005474864 neutron-haproxy-ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135[228067]: [NOTICE]   (228086) : haproxy version is 2.8.14-c23fe91
Oct  7 16:21:19 np0005474864 neutron-haproxy-ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135[228067]: [NOTICE]   (228086) : path to executable is /usr/sbin/haproxy
Oct  7 16:21:19 np0005474864 neutron-haproxy-ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135[228067]: [WARNING]  (228086) : Exiting Master process...
Oct  7 16:21:19 np0005474864 neutron-haproxy-ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135[228067]: [ALERT]    (228086) : Current worker (228089) exited with code 143 (Terminated)
Oct  7 16:21:19 np0005474864 neutron-haproxy-ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135[228067]: [WARNING]  (228086) : All workers exited. Exiting... (0)
Oct  7 16:21:19 np0005474864 systemd[1]: libpod-2a13d920dc8e5fe7bbe17eda8e456b64ca354b7ab638064ca05001d54ad6fd03.scope: Deactivated successfully.
Oct  7 16:21:19 np0005474864 podman[228368]: 2025-10-07 20:21:19.109745777 +0000 UTC m=+0.069356188 container died 2a13d920dc8e5fe7bbe17eda8e456b64ca354b7ab638064ca05001d54ad6fd03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.143 2 INFO nova.virt.libvirt.driver [-] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Instance destroyed successfully.#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.144 2 DEBUG nova.objects.instance [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'resources' on Instance uuid d50d0791-c234-4390-a519-3ce1c8561824 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:21:19 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-2a13d920dc8e5fe7bbe17eda8e456b64ca354b7ab638064ca05001d54ad6fd03-userdata-shm.mount: Deactivated successfully.
Oct  7 16:21:19 np0005474864 systemd[1]: var-lib-containers-storage-overlay-c1a3ce0bdb43d163a58f1eaa307aa1bbe5f7066862c3af7b10efac0e2a9d7ee9-merged.mount: Deactivated successfully.
Oct  7 16:21:19 np0005474864 podman[228368]: 2025-10-07 20:21:19.156529994 +0000 UTC m=+0.116140385 container cleanup 2a13d920dc8e5fe7bbe17eda8e456b64ca354b7ab638064ca05001d54ad6fd03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.161 2 DEBUG nova.virt.libvirt.vif [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:20:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-391741114',display_name='tempest-TestGettingAddress-server-391741114',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-391741114',id=43,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLepS7UZnZEM3zK2oLlFRRLXWOfEHmJt8uAk2zB865zu3amuSShBszwPYXlzJxSQhwgmArQdnWmVkU9wtlkXYFa4p8oAgakCfaAyEYsfUvVjD4w+Nydk19S66h6w3PLg6w==',key_name='tempest-TestGettingAddress-1398019678',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:20:52Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-7u9w3wu0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:20:52Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=d50d0791-c234-4390-a519-3ce1c8561824,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "address": "fa:16:3e:3b:b3:fc", "network": {"id": "df8550c5-2041-43a8-b4e7-93cfe1bef135", "bridge": "br-int", "label": "tempest-network-smoke--522925216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b2b5e3-3d", "ovs_interfaceid": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.161 2 DEBUG nova.network.os_vif_util [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "address": "fa:16:3e:3b:b3:fc", "network": {"id": "df8550c5-2041-43a8-b4e7-93cfe1bef135", "bridge": "br-int", "label": "tempest-network-smoke--522925216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b2b5e3-3d", "ovs_interfaceid": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.162 2 DEBUG nova.network.os_vif_util [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:3b:b3:fc,bridge_name='br-int',has_traffic_filtering=True,id=47b2b5e3-3d12-4a83-9b8f-7229649e4bc6,network=Network(df8550c5-2041-43a8-b4e7-93cfe1bef135),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47b2b5e3-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.163 2 DEBUG os_vif [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:3b:b3:fc,bridge_name='br-int',has_traffic_filtering=True,id=47b2b5e3-3d12-4a83-9b8f-7229649e4bc6,network=Network(df8550c5-2041-43a8-b4e7-93cfe1bef135),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47b2b5e3-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.164 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.165 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap47b2b5e3-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.170 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.176 2 INFO os_vif [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:3b:b3:fc,bridge_name='br-int',has_traffic_filtering=True,id=47b2b5e3-3d12-4a83-9b8f-7229649e4bc6,network=Network(df8550c5-2041-43a8-b4e7-93cfe1bef135),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap47b2b5e3-3d')#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.177 2 DEBUG nova.virt.libvirt.vif [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:20:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-391741114',display_name='tempest-TestGettingAddress-server-391741114',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-391741114',id=43,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBLepS7UZnZEM3zK2oLlFRRLXWOfEHmJt8uAk2zB865zu3amuSShBszwPYXlzJxSQhwgmArQdnWmVkU9wtlkXYFa4p8oAgakCfaAyEYsfUvVjD4w+Nydk19S66h6w3PLg6w==',key_name='tempest-TestGettingAddress-1398019678',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:20:52Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-7u9w3wu0',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:20:52Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=d50d0791-c234-4390-a519-3ce1c8561824,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "address": "fa:16:3e:ab:65:f1", "network": {"id": "59555e18-9ca2-4493-b3a1-3bf0b720e9a1", "bridge": "br-int", "label": "tempest-network-smoke--755268223", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feab:65f1", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81d8ec64-ed", "ovs_interfaceid": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.177 2 DEBUG nova.network.os_vif_util [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "address": "fa:16:3e:ab:65:f1", "network": {"id": "59555e18-9ca2-4493-b3a1-3bf0b720e9a1", "bridge": "br-int", "label": "tempest-network-smoke--755268223", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feab:65f1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81d8ec64-ed", "ovs_interfaceid": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.179 2 DEBUG nova.network.os_vif_util [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ab:65:f1,bridge_name='br-int',has_traffic_filtering=True,id=81d8ec64-ed9f-4338-b27a-8151379ca57b,network=Network(59555e18-9ca2-4493-b3a1-3bf0b720e9a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81d8ec64-ed') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.179 2 DEBUG os_vif [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:65:f1,bridge_name='br-int',has_traffic_filtering=True,id=81d8ec64-ed9f-4338-b27a-8151379ca57b,network=Network(59555e18-9ca2-4493-b3a1-3bf0b720e9a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81d8ec64-ed') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.181 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.181 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap81d8ec64-ed, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.183 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:19 np0005474864 systemd[1]: libpod-conmon-2a13d920dc8e5fe7bbe17eda8e456b64ca354b7ab638064ca05001d54ad6fd03.scope: Deactivated successfully.
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.186 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.188 2 INFO os_vif [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ab:65:f1,bridge_name='br-int',has_traffic_filtering=True,id=81d8ec64-ed9f-4338-b27a-8151379ca57b,network=Network(59555e18-9ca2-4493-b3a1-3bf0b720e9a1),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap81d8ec64-ed')#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.189 2 INFO nova.virt.libvirt.driver [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Deleting instance files /var/lib/nova/instances/d50d0791-c234-4390-a519-3ce1c8561824_del#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.190 2 INFO nova.virt.libvirt.driver [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Deletion of /var/lib/nova/instances/d50d0791-c234-4390-a519-3ce1c8561824_del complete#033[00m
Oct  7 16:21:19 np0005474864 podman[228419]: 2025-10-07 20:21:19.231272156 +0000 UTC m=+0.048603001 container remove 2a13d920dc8e5fe7bbe17eda8e456b64ca354b7ab638064ca05001d54ad6fd03 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3)
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.238 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[cf5be8c0-cf20-4ac2-be74-dff47d0b21b3]: (4, ('Tue Oct  7 08:21:19 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135 (2a13d920dc8e5fe7bbe17eda8e456b64ca354b7ab638064ca05001d54ad6fd03)\n2a13d920dc8e5fe7bbe17eda8e456b64ca354b7ab638064ca05001d54ad6fd03\nTue Oct  7 08:21:19 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135 (2a13d920dc8e5fe7bbe17eda8e456b64ca354b7ab638064ca05001d54ad6fd03)\n2a13d920dc8e5fe7bbe17eda8e456b64ca354b7ab638064ca05001d54ad6fd03\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.239 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c64c5d42-e6a4-462b-a92f-75759cfd4e77]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.240 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdf8550c5-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:21:19 np0005474864 kernel: tapdf8550c5-20: left promiscuous mode
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.248 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.254 2 INFO nova.compute.manager [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.255 2 DEBUG oslo.service.loopingcall [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.255 2 DEBUG nova.compute.manager [-] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.255 2 DEBUG nova.network.neutron [-] [instance: d50d0791-c234-4390-a519-3ce1c8561824] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.261 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[04057832-5338-4ebd-bcc3-dd305c7b1a2d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.287 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[e5436f9e-bb93-49bf-aace-a43e9ecb517c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.289 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[7af815f5-844e-417f-87c4-2ae907e9fa24]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.301 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[e92de1b6-8632-42fa-8c6c-db59576f82ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408664, 'reachable_time': 41222, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228435, 'error': None, 'target': 'ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:19 np0005474864 systemd[1]: run-netns-ovnmeta\x2ddf8550c5\x2d2041\x2d43a8\x2db4e7\x2d93cfe1bef135.mount: Deactivated successfully.
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.303 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-df8550c5-2041-43a8-b4e7-93cfe1bef135 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.303 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[7dc7153d-1bb0-4b5d-a9a9-954877627570]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.305 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 81d8ec64-ed9f-4338-b27a-8151379ca57b in datapath 59555e18-9ca2-4493-b3a1-3bf0b720e9a1 unbound from our chassis#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.306 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 59555e18-9ca2-4493-b3a1-3bf0b720e9a1, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.307 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[8cbe2795-c92b-42b8-96cb-c3d6e166bb9e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.307 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1 namespace which is not needed anymore#033[00m
Oct  7 16:21:19 np0005474864 neutron-haproxy-ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1[228155]: [NOTICE]   (228159) : haproxy version is 2.8.14-c23fe91
Oct  7 16:21:19 np0005474864 neutron-haproxy-ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1[228155]: [NOTICE]   (228159) : path to executable is /usr/sbin/haproxy
Oct  7 16:21:19 np0005474864 neutron-haproxy-ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1[228155]: [WARNING]  (228159) : Exiting Master process...
Oct  7 16:21:19 np0005474864 neutron-haproxy-ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1[228155]: [ALERT]    (228159) : Current worker (228161) exited with code 143 (Terminated)
Oct  7 16:21:19 np0005474864 neutron-haproxy-ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1[228155]: [WARNING]  (228159) : All workers exited. Exiting... (0)
Oct  7 16:21:19 np0005474864 systemd[1]: libpod-1ba4af3e683a03fd38f51d2d66ca7775c91e060b577419e91135e926cb8fae40.scope: Deactivated successfully.
Oct  7 16:21:19 np0005474864 podman[228453]: 2025-10-07 20:21:19.462899075 +0000 UTC m=+0.063213841 container died 1ba4af3e683a03fd38f51d2d66ca7775c91e060b577419e91135e926cb8fae40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:21:19 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1ba4af3e683a03fd38f51d2d66ca7775c91e060b577419e91135e926cb8fae40-userdata-shm.mount: Deactivated successfully.
Oct  7 16:21:19 np0005474864 systemd[1]: var-lib-containers-storage-overlay-0c7842f16d5ad7182cdeca92c89ff85cc393f5babe7d97a485df76e8a0dffc13-merged.mount: Deactivated successfully.
Oct  7 16:21:19 np0005474864 podman[228453]: 2025-10-07 20:21:19.508952821 +0000 UTC m=+0.109267627 container cleanup 1ba4af3e683a03fd38f51d2d66ca7775c91e060b577419e91135e926cb8fae40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  7 16:21:19 np0005474864 systemd[1]: libpod-conmon-1ba4af3e683a03fd38f51d2d66ca7775c91e060b577419e91135e926cb8fae40.scope: Deactivated successfully.
Oct  7 16:21:19 np0005474864 podman[228483]: 2025-10-07 20:21:19.594184636 +0000 UTC m=+0.059053642 container remove 1ba4af3e683a03fd38f51d2d66ca7775c91e060b577419e91135e926cb8fae40 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.602 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[0284524b-5d0e-49c2-a0da-56ed6626241b]: (4, ('Tue Oct  7 08:21:19 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1 (1ba4af3e683a03fd38f51d2d66ca7775c91e060b577419e91135e926cb8fae40)\n1ba4af3e683a03fd38f51d2d66ca7775c91e060b577419e91135e926cb8fae40\nTue Oct  7 08:21:19 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1 (1ba4af3e683a03fd38f51d2d66ca7775c91e060b577419e91135e926cb8fae40)\n1ba4af3e683a03fd38f51d2d66ca7775c91e060b577419e91135e926cb8fae40\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.604 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d8c6fe33-e78f-4320-a7b4-f13eefe9cae0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.606 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap59555e18-90, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.608 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:19 np0005474864 kernel: tap59555e18-90: left promiscuous mode
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.614 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[fbe0a88f-0627-46d4-83d1-3e11ac286b20]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.621 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.649 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[f09843e1-c741-4831-810a-01614b533c09]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.651 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d8979e0e-492a-45e7-8cbf-34dcaf61e348]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.671 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[cdc43068-4193-49fd-bbf9-9da13da70538]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 408769, 'reachable_time': 15810, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228498, 'error': None, 'target': 'ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.677 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-59555e18-9ca2-4493-b3a1-3bf0b720e9a1 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:21:19 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:19.677 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[85c95cde-46a5-4332-a1bb-9e234606d1b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:19 np0005474864 nova_compute[192593]: 2025-10-07 20:21:19.696 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:20 np0005474864 systemd[1]: run-netns-ovnmeta\x2d59555e18\x2d9ca2\x2d4493\x2db3a1\x2d3bf0b720e9a1.mount: Deactivated successfully.
Oct  7 16:21:20 np0005474864 nova_compute[192593]: 2025-10-07 20:21:20.223 2 DEBUG nova.compute.manager [req-34b923e2-47e5-4fb4-bb7c-2528829337fa req-fed7adfa-3458-45f5-822c-502111b8bbc5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Received event network-vif-deleted-81d8ec64-ed9f-4338-b27a-8151379ca57b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:21:20 np0005474864 nova_compute[192593]: 2025-10-07 20:21:20.223 2 INFO nova.compute.manager [req-34b923e2-47e5-4fb4-bb7c-2528829337fa req-fed7adfa-3458-45f5-822c-502111b8bbc5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Neutron deleted interface 81d8ec64-ed9f-4338-b27a-8151379ca57b; detaching it from the instance and deleting it from the info cache#033[00m
Oct  7 16:21:20 np0005474864 nova_compute[192593]: 2025-10-07 20:21:20.223 2 DEBUG nova.network.neutron [req-34b923e2-47e5-4fb4-bb7c-2528829337fa req-fed7adfa-3458-45f5-822c-502111b8bbc5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Updating instance_info_cache with network_info: [{"id": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "address": "fa:16:3e:3b:b3:fc", "network": {"id": "df8550c5-2041-43a8-b4e7-93cfe1bef135", "bridge": "br-int", "label": "tempest-network-smoke--522925216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.241", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b2b5e3-3d", "ovs_interfaceid": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:21:20 np0005474864 nova_compute[192593]: 2025-10-07 20:21:20.250 2 DEBUG nova.compute.manager [req-34b923e2-47e5-4fb4-bb7c-2528829337fa req-fed7adfa-3458-45f5-822c-502111b8bbc5 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Detach interface failed, port_id=81d8ec64-ed9f-4338-b27a-8151379ca57b, reason: Instance d50d0791-c234-4390-a519-3ce1c8561824 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Oct  7 16:21:20 np0005474864 nova_compute[192593]: 2025-10-07 20:21:20.470 2 DEBUG nova.network.neutron [req-8d3b3e90-2188-4c42-a793-f760590896ed req-d9b8b303-2fad-487d-acd3-c90d2b48315a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Updated VIF entry in instance network info cache for port 47b2b5e3-3d12-4a83-9b8f-7229649e4bc6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:21:20 np0005474864 nova_compute[192593]: 2025-10-07 20:21:20.471 2 DEBUG nova.network.neutron [req-8d3b3e90-2188-4c42-a793-f760590896ed req-d9b8b303-2fad-487d-acd3-c90d2b48315a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Updating instance_info_cache with network_info: [{"id": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "address": "fa:16:3e:3b:b3:fc", "network": {"id": "df8550c5-2041-43a8-b4e7-93cfe1bef135", "bridge": "br-int", "label": "tempest-network-smoke--522925216", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap47b2b5e3-3d", "ovs_interfaceid": "47b2b5e3-3d12-4a83-9b8f-7229649e4bc6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "address": "fa:16:3e:ab:65:f1", "network": {"id": "59555e18-9ca2-4493-b3a1-3bf0b720e9a1", "bridge": "br-int", "label": "tempest-network-smoke--755268223", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feab:65f1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap81d8ec64-ed", "ovs_interfaceid": "81d8ec64-ed9f-4338-b27a-8151379ca57b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:21:20 np0005474864 nova_compute[192593]: 2025-10-07 20:21:20.620 2 DEBUG oslo_concurrency.lockutils [req-8d3b3e90-2188-4c42-a793-f760590896ed req-d9b8b303-2fad-487d-acd3-c90d2b48315a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-d50d0791-c234-4390-a519-3ce1c8561824" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:21:20 np0005474864 nova_compute[192593]: 2025-10-07 20:21:20.639 2 DEBUG nova.network.neutron [-] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:21:20 np0005474864 nova_compute[192593]: 2025-10-07 20:21:20.674 2 INFO nova.compute.manager [-] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Took 1.42 seconds to deallocate network for instance.#033[00m
Oct  7 16:21:20 np0005474864 nova_compute[192593]: 2025-10-07 20:21:20.727 2 DEBUG oslo_concurrency.lockutils [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:20 np0005474864 nova_compute[192593]: 2025-10-07 20:21:20.728 2 DEBUG oslo_concurrency.lockutils [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:20 np0005474864 nova_compute[192593]: 2025-10-07 20:21:20.791 2 DEBUG nova.compute.provider_tree [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:21:20 np0005474864 nova_compute[192593]: 2025-10-07 20:21:20.814 2 DEBUG nova.scheduler.client.report [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:21:20 np0005474864 nova_compute[192593]: 2025-10-07 20:21:20.835 2 DEBUG oslo_concurrency.lockutils [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:20 np0005474864 nova_compute[192593]: 2025-10-07 20:21:20.904 2 INFO nova.scheduler.client.report [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Deleted allocations for instance d50d0791-c234-4390-a519-3ce1c8561824#033[00m
Oct  7 16:21:20 np0005474864 nova_compute[192593]: 2025-10-07 20:21:20.979 2 DEBUG oslo_concurrency.lockutils [None req-b25b8fce-f68b-4ef0-8596-08c63a5a2ce1 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.142s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:21 np0005474864 nova_compute[192593]: 2025-10-07 20:21:21.186 2 DEBUG nova.compute.manager [req-a5fee929-ed84-4210-af5b-0a11950fef2f req-6bfb2ad7-9b28-499e-a21d-7daa68400ed7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Received event network-vif-unplugged-81d8ec64-ed9f-4338-b27a-8151379ca57b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:21:21 np0005474864 nova_compute[192593]: 2025-10-07 20:21:21.187 2 DEBUG oslo_concurrency.lockutils [req-a5fee929-ed84-4210-af5b-0a11950fef2f req-6bfb2ad7-9b28-499e-a21d-7daa68400ed7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d50d0791-c234-4390-a519-3ce1c8561824-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:21 np0005474864 nova_compute[192593]: 2025-10-07 20:21:21.187 2 DEBUG oslo_concurrency.lockutils [req-a5fee929-ed84-4210-af5b-0a11950fef2f req-6bfb2ad7-9b28-499e-a21d-7daa68400ed7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:21 np0005474864 nova_compute[192593]: 2025-10-07 20:21:21.187 2 DEBUG oslo_concurrency.lockutils [req-a5fee929-ed84-4210-af5b-0a11950fef2f req-6bfb2ad7-9b28-499e-a21d-7daa68400ed7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:21 np0005474864 nova_compute[192593]: 2025-10-07 20:21:21.188 2 DEBUG nova.compute.manager [req-a5fee929-ed84-4210-af5b-0a11950fef2f req-6bfb2ad7-9b28-499e-a21d-7daa68400ed7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] No waiting events found dispatching network-vif-unplugged-81d8ec64-ed9f-4338-b27a-8151379ca57b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:21:21 np0005474864 nova_compute[192593]: 2025-10-07 20:21:21.188 2 WARNING nova.compute.manager [req-a5fee929-ed84-4210-af5b-0a11950fef2f req-6bfb2ad7-9b28-499e-a21d-7daa68400ed7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Received unexpected event network-vif-unplugged-81d8ec64-ed9f-4338-b27a-8151379ca57b for instance with vm_state deleted and task_state None.#033[00m
Oct  7 16:21:21 np0005474864 nova_compute[192593]: 2025-10-07 20:21:21.189 2 DEBUG nova.compute.manager [req-a5fee929-ed84-4210-af5b-0a11950fef2f req-6bfb2ad7-9b28-499e-a21d-7daa68400ed7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Received event network-vif-plugged-81d8ec64-ed9f-4338-b27a-8151379ca57b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:21:21 np0005474864 nova_compute[192593]: 2025-10-07 20:21:21.189 2 DEBUG oslo_concurrency.lockutils [req-a5fee929-ed84-4210-af5b-0a11950fef2f req-6bfb2ad7-9b28-499e-a21d-7daa68400ed7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "d50d0791-c234-4390-a519-3ce1c8561824-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:21 np0005474864 nova_compute[192593]: 2025-10-07 20:21:21.189 2 DEBUG oslo_concurrency.lockutils [req-a5fee929-ed84-4210-af5b-0a11950fef2f req-6bfb2ad7-9b28-499e-a21d-7daa68400ed7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:21 np0005474864 nova_compute[192593]: 2025-10-07 20:21:21.190 2 DEBUG oslo_concurrency.lockutils [req-a5fee929-ed84-4210-af5b-0a11950fef2f req-6bfb2ad7-9b28-499e-a21d-7daa68400ed7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "d50d0791-c234-4390-a519-3ce1c8561824-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:21 np0005474864 nova_compute[192593]: 2025-10-07 20:21:21.190 2 DEBUG nova.compute.manager [req-a5fee929-ed84-4210-af5b-0a11950fef2f req-6bfb2ad7-9b28-499e-a21d-7daa68400ed7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] No waiting events found dispatching network-vif-plugged-81d8ec64-ed9f-4338-b27a-8151379ca57b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:21:21 np0005474864 nova_compute[192593]: 2025-10-07 20:21:21.191 2 WARNING nova.compute.manager [req-a5fee929-ed84-4210-af5b-0a11950fef2f req-6bfb2ad7-9b28-499e-a21d-7daa68400ed7 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Received unexpected event network-vif-plugged-81d8ec64-ed9f-4338-b27a-8151379ca57b for instance with vm_state deleted and task_state None.#033[00m
Oct  7 16:21:22 np0005474864 nova_compute[192593]: 2025-10-07 20:21:22.344 2 DEBUG nova.compute.manager [req-4c93baaf-291e-44d8-8df2-1d75dcc0a3a1 req-435ec09b-2a40-415e-aeac-0faba8f08a0f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Received event network-vif-deleted-47b2b5e3-3d12-4a83-9b8f-7229649e4bc6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:21:22 np0005474864 podman[228499]: 2025-10-07 20:21:22.380591087 +0000 UTC m=+0.065516107 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  7 16:21:24 np0005474864 nova_compute[192593]: 2025-10-07 20:21:24.125 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:24 np0005474864 nova_compute[192593]: 2025-10-07 20:21:24.184 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.028 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "36338d64-f0f0-468c-be11-8a124d76cb6e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.029 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.083 2 DEBUG nova.compute.manager [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.293 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.294 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.302 2 DEBUG nova.virt.hardware [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.303 2 INFO nova.compute.claims [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  7 16:21:25 np0005474864 podman[228518]: 2025-10-07 20:21:25.379851138 +0000 UTC m=+0.064814418 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.499 2 DEBUG nova.compute.provider_tree [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.549 2 DEBUG nova.scheduler.client.report [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.614 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.320s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.615 2 DEBUG nova.compute.manager [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.676 2 DEBUG nova.compute.manager [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.677 2 DEBUG nova.network.neutron [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.767 2 INFO nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.791 2 DEBUG nova.compute.manager [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.877 2 DEBUG nova.policy [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': 'db22b0e0f6594362af24484ba9b01936', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.922 2 DEBUG nova.compute.manager [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.924 2 DEBUG nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.925 2 INFO nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Creating image(s)#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.926 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "/var/lib/nova/instances/36338d64-f0f0-468c-be11-8a124d76cb6e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.926 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "/var/lib/nova/instances/36338d64-f0f0-468c-be11-8a124d76cb6e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.927 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "/var/lib/nova/instances/36338d64-f0f0-468c-be11-8a124d76cb6e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:25 np0005474864 nova_compute[192593]: 2025-10-07 20:21:25.943 2 DEBUG oslo_concurrency.processutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.023 2 DEBUG oslo_concurrency.processutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.024 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.025 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.045 2 DEBUG oslo_concurrency.processutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.140 2 DEBUG oslo_concurrency.processutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.141 2 DEBUG oslo_concurrency.processutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/36338d64-f0f0-468c-be11-8a124d76cb6e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.191 2 DEBUG oslo_concurrency.processutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/36338d64-f0f0-468c-be11-8a124d76cb6e/disk 1073741824" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.192 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.168s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.193 2 DEBUG oslo_concurrency.processutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.272 2 DEBUG oslo_concurrency.processutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.079s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.273 2 DEBUG nova.virt.disk.api [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Checking if we can resize image /var/lib/nova/instances/36338d64-f0f0-468c-be11-8a124d76cb6e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.273 2 DEBUG oslo_concurrency.processutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36338d64-f0f0-468c-be11-8a124d76cb6e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.323 2 DEBUG oslo_concurrency.processutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/36338d64-f0f0-468c-be11-8a124d76cb6e/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.324 2 DEBUG nova.virt.disk.api [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Cannot resize image /var/lib/nova/instances/36338d64-f0f0-468c-be11-8a124d76cb6e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.325 2 DEBUG nova.objects.instance [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lazy-loading 'migration_context' on Instance uuid 36338d64-f0f0-468c-be11-8a124d76cb6e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.374 2 DEBUG nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.374 2 DEBUG nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Ensure instance console log exists: /var/lib/nova/instances/36338d64-f0f0-468c-be11-8a124d76cb6e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.375 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.375 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.376 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:26 np0005474864 nova_compute[192593]: 2025-10-07 20:21:26.686 2 DEBUG nova.network.neutron [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Successfully created port: 82ce1d9a-86a1-4412-a852-b1ee4b3139bf _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.091 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.116 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.117 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.117 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.118 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.363 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.364 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5726MB free_disk=73.46313858032227GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.364 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.365 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.424 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Instance 36338d64-f0f0-468c-be11-8a124d76cb6e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.425 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.425 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.481 2 DEBUG nova.network.neutron [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Successfully updated port: 82ce1d9a-86a1-4412-a852-b1ee4b3139bf _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.485 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.501 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "refresh_cache-36338d64-f0f0-468c-be11-8a124d76cb6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.502 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquired lock "refresh_cache-36338d64-f0f0-468c-be11-8a124d76cb6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.502 2 DEBUG nova.network.neutron [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.505 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.540 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.540 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:27 np0005474864 nova_compute[192593]: 2025-10-07 20:21:27.743 2 DEBUG nova.network.neutron [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.542 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.542 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.633 2 DEBUG nova.network.neutron [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Updating instance_info_cache with network_info: [{"id": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "address": "fa:16:3e:40:18:be", "network": {"id": "7a72b09b-88c3-4d24-8cee-ec65d8210d47", "bridge": "br-int", "label": "tempest-network-smoke--766702828", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ce1d9a-86", "ovs_interfaceid": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.656 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Releasing lock "refresh_cache-36338d64-f0f0-468c-be11-8a124d76cb6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.657 2 DEBUG nova.compute.manager [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Instance network_info: |[{"id": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "address": "fa:16:3e:40:18:be", "network": {"id": "7a72b09b-88c3-4d24-8cee-ec65d8210d47", "bridge": "br-int", "label": "tempest-network-smoke--766702828", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ce1d9a-86", "ovs_interfaceid": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.661 2 DEBUG nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Start _get_guest_xml network_info=[{"id": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "address": "fa:16:3e:40:18:be", "network": {"id": "7a72b09b-88c3-4d24-8cee-ec65d8210d47", "bridge": "br-int", "label": "tempest-network-smoke--766702828", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ce1d9a-86", "ovs_interfaceid": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.667 2 WARNING nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.673 2 DEBUG nova.virt.libvirt.host [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.674 2 DEBUG nova.virt.libvirt.host [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.679 2 DEBUG nova.virt.libvirt.host [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.679 2 DEBUG nova.virt.libvirt.host [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.681 2 DEBUG nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.682 2 DEBUG nova.virt.hardware [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T20:09:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3fec056a-1226-48ad-a02c-e4fe097a9363',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.682 2 DEBUG nova.virt.hardware [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.683 2 DEBUG nova.virt.hardware [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.683 2 DEBUG nova.virt.hardware [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.683 2 DEBUG nova.virt.hardware [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.684 2 DEBUG nova.virt.hardware [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.684 2 DEBUG nova.virt.hardware [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.685 2 DEBUG nova.virt.hardware [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.685 2 DEBUG nova.virt.hardware [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.686 2 DEBUG nova.virt.hardware [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.686 2 DEBUG nova.virt.hardware [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.691 2 DEBUG nova.virt.libvirt.vif [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:21:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-883660464',display_name='tempest-TestNetworkAdvancedServerOps-server-883660464',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-883660464',id=45,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOyB3CcNORNcsfVL/w9sXfU3FdzkZ4AiGT/StOp9LHIsdKxlw+iLDlhVW1MIaui1RMyyiBSqaPsgoKcY2aLSu0V33V1Wp3zJIlOmQfh9LFe+fpXTG0rgU+nykyRls0Ao5w==',key_name='tempest-TestNetworkAdvancedServerOps-516914566',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8a545a398e2e433bbe3f3dfa2ec4ebcb',ramdisk_id='',reservation_id='r-s2m74s4z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-585003851',owner_user_name='tempest-TestNetworkAdvancedServerOps-585003851-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:21:25Z,user_data=None,user_id='db22b0e0f6594362af24484ba9b01936',uuid=36338d64-f0f0-468c-be11-8a124d76cb6e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "address": "fa:16:3e:40:18:be", "network": {"id": "7a72b09b-88c3-4d24-8cee-ec65d8210d47", "bridge": "br-int", "label": "tempest-network-smoke--766702828", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ce1d9a-86", "ovs_interfaceid": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.692 2 DEBUG nova.network.os_vif_util [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converting VIF {"id": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "address": "fa:16:3e:40:18:be", "network": {"id": "7a72b09b-88c3-4d24-8cee-ec65d8210d47", "bridge": "br-int", "label": "tempest-network-smoke--766702828", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ce1d9a-86", "ovs_interfaceid": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.693 2 DEBUG nova.network.os_vif_util [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:18:be,bridge_name='br-int',has_traffic_filtering=True,id=82ce1d9a-86a1-4412-a852-b1ee4b3139bf,network=Network(7a72b09b-88c3-4d24-8cee-ec65d8210d47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ce1d9a-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.695 2 DEBUG nova.objects.instance [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lazy-loading 'pci_devices' on Instance uuid 36338d64-f0f0-468c-be11-8a124d76cb6e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.717 2 DEBUG nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] End _get_guest_xml xml=<domain type="kvm">
Oct  7 16:21:28 np0005474864 nova_compute[192593]:  <uuid>36338d64-f0f0-468c-be11-8a124d76cb6e</uuid>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:  <name>instance-0000002d</name>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:  <memory>131072</memory>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:  <vcpu>1</vcpu>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-883660464</nova:name>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <nova:creationTime>2025-10-07 20:21:28</nova:creationTime>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <nova:flavor name="m1.nano">
Oct  7 16:21:28 np0005474864 nova_compute[192593]:        <nova:memory>128</nova:memory>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:        <nova:disk>1</nova:disk>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:        <nova:swap>0</nova:swap>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:        <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:        <nova:vcpus>1</nova:vcpus>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      </nova:flavor>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <nova:owner>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:        <nova:user uuid="db22b0e0f6594362af24484ba9b01936">tempest-TestNetworkAdvancedServerOps-585003851-project-member</nova:user>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:        <nova:project uuid="8a545a398e2e433bbe3f3dfa2ec4ebcb">tempest-TestNetworkAdvancedServerOps-585003851</nova:project>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      </nova:owner>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <nova:ports>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:        <nova:port uuid="82ce1d9a-86a1-4412-a852-b1ee4b3139bf">
Oct  7 16:21:28 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      </nova:ports>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    </nova:instance>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:  <sysinfo type="smbios">
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <entry name="manufacturer">RDO</entry>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <entry name="product">OpenStack Compute</entry>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <entry name="serial">36338d64-f0f0-468c-be11-8a124d76cb6e</entry>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <entry name="uuid">36338d64-f0f0-468c-be11-8a124d76cb6e</entry>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <entry name="family">Virtual Machine</entry>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <boot dev="hd"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <smbios mode="sysinfo"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <vmcoreinfo/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:  <clock offset="utc">
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <timer name="pit" tickpolicy="delay"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <timer name="hpet" present="no"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:  <cpu mode="custom" match="exact">
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <model>Nehalem</model>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <topology sockets="1" cores="1" threads="1"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <disk type="file" device="disk">
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/36338d64-f0f0-468c-be11-8a124d76cb6e/disk"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <target dev="vda" bus="virtio"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <disk type="file" device="cdrom">
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <driver name="qemu" type="raw" cache="none"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/36338d64-f0f0-468c-be11-8a124d76cb6e/disk.config"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <target dev="sda" bus="sata"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:40:18:be"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <target dev="tap82ce1d9a-86"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <serial type="pty">
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <log file="/var/lib/nova/instances/36338d64-f0f0-468c-be11-8a124d76cb6e/console.log" append="off"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <input type="tablet" bus="usb"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <rng model="virtio">
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <backend model="random">/dev/urandom</backend>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <controller type="usb" index="0"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    <memballoon model="virtio">
Oct  7 16:21:28 np0005474864 nova_compute[192593]:      <stats period="10"/>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:21:28 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:21:28 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:21:28 np0005474864 nova_compute[192593]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.718 2 DEBUG nova.compute.manager [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Preparing to wait for external event network-vif-plugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.719 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.719 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.719 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.721 2 DEBUG nova.virt.libvirt.vif [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:21:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-883660464',display_name='tempest-TestNetworkAdvancedServerOps-server-883660464',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-883660464',id=45,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOyB3CcNORNcsfVL/w9sXfU3FdzkZ4AiGT/StOp9LHIsdKxlw+iLDlhVW1MIaui1RMyyiBSqaPsgoKcY2aLSu0V33V1Wp3zJIlOmQfh9LFe+fpXTG0rgU+nykyRls0Ao5w==',key_name='tempest-TestNetworkAdvancedServerOps-516914566',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='8a545a398e2e433bbe3f3dfa2ec4ebcb',ramdisk_id='',reservation_id='r-s2m74s4z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-585003851',owner_user_name='tempest-TestNetworkAdvancedServerOps-585003851-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:21:25Z,user_data=None,user_id='db22b0e0f6594362af24484ba9b01936',uuid=36338d64-f0f0-468c-be11-8a124d76cb6e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "address": "fa:16:3e:40:18:be", "network": {"id": "7a72b09b-88c3-4d24-8cee-ec65d8210d47", "bridge": "br-int", "label": "tempest-network-smoke--766702828", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ce1d9a-86", "ovs_interfaceid": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.721 2 DEBUG nova.network.os_vif_util [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converting VIF {"id": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "address": "fa:16:3e:40:18:be", "network": {"id": "7a72b09b-88c3-4d24-8cee-ec65d8210d47", "bridge": "br-int", "label": "tempest-network-smoke--766702828", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ce1d9a-86", "ovs_interfaceid": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.722 2 DEBUG nova.network.os_vif_util [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:18:be,bridge_name='br-int',has_traffic_filtering=True,id=82ce1d9a-86a1-4412-a852-b1ee4b3139bf,network=Network(7a72b09b-88c3-4d24-8cee-ec65d8210d47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ce1d9a-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.723 2 DEBUG os_vif [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:18:be,bridge_name='br-int',has_traffic_filtering=True,id=82ce1d9a-86a1-4412-a852-b1ee4b3139bf,network=Network(7a72b09b-88c3-4d24-8cee-ec65d8210d47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ce1d9a-86') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.724 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.725 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.725 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.730 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.730 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82ce1d9a-86, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.731 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap82ce1d9a-86, col_values=(('external_ids', {'iface-id': '82ce1d9a-86a1-4412-a852-b1ee4b3139bf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:40:18:be', 'vm-uuid': '36338d64-f0f0-468c-be11-8a124d76cb6e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:28 np0005474864 NetworkManager[51631]: <info>  [1759868488.7988] manager: (tap82ce1d9a-86): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/129)
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.801 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.810 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.811 2 INFO os_vif [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:18:be,bridge_name='br-int',has_traffic_filtering=True,id=82ce1d9a-86a1-4412-a852-b1ee4b3139bf,network=Network(7a72b09b-88c3-4d24-8cee-ec65d8210d47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ce1d9a-86')#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.817 2 DEBUG nova.compute.manager [req-f1232609-7ff4-448d-9f56-3a9aaac97802 req-1586ad5f-0279-4c53-9e24-609c0997b054 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Received event network-changed-82ce1d9a-86a1-4412-a852-b1ee4b3139bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.818 2 DEBUG nova.compute.manager [req-f1232609-7ff4-448d-9f56-3a9aaac97802 req-1586ad5f-0279-4c53-9e24-609c0997b054 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Refreshing instance network info cache due to event network-changed-82ce1d9a-86a1-4412-a852-b1ee4b3139bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.818 2 DEBUG oslo_concurrency.lockutils [req-f1232609-7ff4-448d-9f56-3a9aaac97802 req-1586ad5f-0279-4c53-9e24-609c0997b054 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-36338d64-f0f0-468c-be11-8a124d76cb6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.819 2 DEBUG oslo_concurrency.lockutils [req-f1232609-7ff4-448d-9f56-3a9aaac97802 req-1586ad5f-0279-4c53-9e24-609c0997b054 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-36338d64-f0f0-468c-be11-8a124d76cb6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.819 2 DEBUG nova.network.neutron [req-f1232609-7ff4-448d-9f56-3a9aaac97802 req-1586ad5f-0279-4c53-9e24-609c0997b054 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Refreshing network info cache for port 82ce1d9a-86a1-4412-a852-b1ee4b3139bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.878 2 DEBUG nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.879 2 DEBUG nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.879 2 DEBUG nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] No VIF found with MAC fa:16:3e:40:18:be, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:21:28 np0005474864 nova_compute[192593]: 2025-10-07 20:21:28.880 2 INFO nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Using config drive#033[00m
Oct  7 16:21:29 np0005474864 nova_compute[192593]: 2025-10-07 20:21:29.126 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:29 np0005474864 nova_compute[192593]: 2025-10-07 20:21:29.388 2 INFO nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Creating config drive at /var/lib/nova/instances/36338d64-f0f0-468c-be11-8a124d76cb6e/disk.config#033[00m
Oct  7 16:21:29 np0005474864 nova_compute[192593]: 2025-10-07 20:21:29.399 2 DEBUG oslo_concurrency.processutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/36338d64-f0f0-468c-be11-8a124d76cb6e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmgoyexr5 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:21:29 np0005474864 podman[228561]: 2025-10-07 20:21:29.410782083 +0000 UTC m=+0.096690135 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, 
tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:21:29 np0005474864 nova_compute[192593]: 2025-10-07 20:21:29.542 2 DEBUG oslo_concurrency.processutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/36338d64-f0f0-468c-be11-8a124d76cb6e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmgoyexr5" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:21:29 np0005474864 kernel: tap82ce1d9a-86: entered promiscuous mode
Oct  7 16:21:29 np0005474864 NetworkManager[51631]: <info>  [1759868489.6129] manager: (tap82ce1d9a-86): new Tun device (/org/freedesktop/NetworkManager/Devices/130)
Oct  7 16:21:29 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:29Z|00242|binding|INFO|Claiming lport 82ce1d9a-86a1-4412-a852-b1ee4b3139bf for this chassis.
Oct  7 16:21:29 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:29Z|00243|binding|INFO|82ce1d9a-86a1-4412-a852-b1ee4b3139bf: Claiming fa:16:3e:40:18:be 10.100.0.10
Oct  7 16:21:29 np0005474864 nova_compute[192593]: 2025-10-07 20:21:29.615 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:29.627 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:18:be 10.100.0.10'], port_security=['fa:16:3e:40:18:be 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '36338d64-f0f0-468c-be11-8a124d76cb6e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a72b09b-88c3-4d24-8cee-ec65d8210d47', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'bc1c8622-dacf-40f5-b88e-b142e1321b0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51fe249a-d1c8-46e8-8b8e-10a8ba338db8, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=82ce1d9a-86a1-4412-a852-b1ee4b3139bf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:21:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:29.628 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 82ce1d9a-86a1-4412-a852-b1ee4b3139bf in datapath 7a72b09b-88c3-4d24-8cee-ec65d8210d47 bound to our chassis#033[00m
Oct  7 16:21:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:29.630 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7a72b09b-88c3-4d24-8cee-ec65d8210d47#033[00m
Oct  7 16:21:29 np0005474864 systemd-udevd[228599]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:21:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:29.647 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[2b0a9c1e-af65-49a7-90a8-36fe09ff64f9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:29.648 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7a72b09b-81 in ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:21:29 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:29Z|00244|binding|INFO|Setting lport 82ce1d9a-86a1-4412-a852-b1ee4b3139bf ovn-installed in OVS
Oct  7 16:21:29 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:29Z|00245|binding|INFO|Setting lport 82ce1d9a-86a1-4412-a852-b1ee4b3139bf up in Southbound
Oct  7 16:21:29 np0005474864 nova_compute[192593]: 2025-10-07 20:21:29.652 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:29.651 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7a72b09b-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:21:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:29.651 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[3348ed7d-b688-4aea-b925-3d750d5bdade]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:29.653 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[a32e63a3-9c89-4150-abba-bf92e1471871]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:29 np0005474864 NetworkManager[51631]: <info>  [1759868489.6551] device (tap82ce1d9a-86): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:21:29 np0005474864 NetworkManager[51631]: <info>  [1759868489.6574] device (tap82ce1d9a-86): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:21:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:29.669 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[91040d8e-018f-416b-bdd9-742510a55301]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:29 np0005474864 systemd-machined[152586]: New machine qemu-16-instance-0000002d.
Oct  7 16:21:29 np0005474864 systemd[1]: Started Virtual Machine qemu-16-instance-0000002d.
Oct  7 16:21:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:29.694 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[4a65f64c-404b-4db9-8123-533878e955b6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:29.732 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[a052efed-eeb7-4f52-877e-337cc1bc1bfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:29.740 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[383e9e46-d3a1-473d-a1fe-dc30375b8549]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:29 np0005474864 NetworkManager[51631]: <info>  [1759868489.7424] manager: (tap7a72b09b-80): new Veth device (/org/freedesktop/NetworkManager/Devices/131)
Oct  7 16:21:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:29.775 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[69257cb7-d3cf-4f0b-b43d-2e7e11278d6d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:29.779 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[d8990235-7122-44b3-b24a-21126c2ef71d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:29 np0005474864 NetworkManager[51631]: <info>  [1759868489.8076] device (tap7a72b09b-80): carrier: link connected
Oct  7 16:21:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:29.817 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[4be32b4f-9208-41e3-a8ca-81d0cb93d1f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:29.841 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[3ac227c3-5af1-4ecd-afd1-2ad5c4416f6d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7a72b09b-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:70:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412570, 'reachable_time': 21577, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228632, 'error': None, 'target': 'ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:29.866 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[339edcf7-e3ed-4ef7-be0a-be99552aedaa]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5c:7014'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 412570, 'tstamp': 412570}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 228633, 'error': None, 'target': 'ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:29.893 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[40ac3d1a-0cc7-4c05-b16a-da672069efd7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7a72b09b-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:70:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 74], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412570, 'reachable_time': 21577, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 228634, 'error': None, 'target': 'ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:29 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:29.946 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[4ebff92f-68b1-44ee-ae00-07ec8991da29]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:30.049 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[118e116c-8d7a-492f-8855-053756989960]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:30.051 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a72b09b-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:30.051 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:30.052 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a72b09b-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:21:30 np0005474864 kernel: tap7a72b09b-80: entered promiscuous mode
Oct  7 16:21:30 np0005474864 NetworkManager[51631]: <info>  [1759868490.1019] manager: (tap7a72b09b-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/132)
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.100 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:30.104 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7a72b09b-80, col_values=(('external_ids', {'iface-id': 'd5f080d2-c8a7-4a60-b03d-eaf0f085bebb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.104 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:30 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:30Z|00246|binding|INFO|Releasing lport d5f080d2-c8a7-4a60-b03d-eaf0f085bebb from this chassis (sb_readonly=0)
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.108 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.124 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:30.125 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7a72b09b-88c3-4d24-8cee-ec65d8210d47.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7a72b09b-88c3-4d24-8cee-ec65d8210d47.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:30.126 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[43b3a877-9b06-4aab-80fb-240b67361e72]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:30.126 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-7a72b09b-88c3-4d24-8cee-ec65d8210d47
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/7a72b09b-88c3-4d24-8cee-ec65d8210d47.pid.haproxy
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID 7a72b09b-88c3-4d24-8cee-ec65d8210d47
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:21:30 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:30.127 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47', 'env', 'PROCESS_TAG=haproxy-7a72b09b-88c3-4d24-8cee-ec65d8210d47', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7a72b09b-88c3-4d24-8cee-ec65d8210d47.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.470 2 DEBUG nova.network.neutron [req-f1232609-7ff4-448d-9f56-3a9aaac97802 req-1586ad5f-0279-4c53-9e24-609c0997b054 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Updated VIF entry in instance network info cache for port 82ce1d9a-86a1-4412-a852-b1ee4b3139bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.470 2 DEBUG nova.network.neutron [req-f1232609-7ff4-448d-9f56-3a9aaac97802 req-1586ad5f-0279-4c53-9e24-609c0997b054 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Updating instance_info_cache with network_info: [{"id": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "address": "fa:16:3e:40:18:be", "network": {"id": "7a72b09b-88c3-4d24-8cee-ec65d8210d47", "bridge": "br-int", "label": "tempest-network-smoke--766702828", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ce1d9a-86", "ovs_interfaceid": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.495 2 DEBUG oslo_concurrency.lockutils [req-f1232609-7ff4-448d-9f56-3a9aaac97802 req-1586ad5f-0279-4c53-9e24-609c0997b054 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-36338d64-f0f0-468c-be11-8a124d76cb6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:21:30 np0005474864 podman[228673]: 2025-10-07 20:21:30.539395801 +0000 UTC m=+0.060530284 container create b7033b14cdc36fc1e4be49662b062475ff900116966061e63d4a87ea2425e615 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  7 16:21:30 np0005474864 systemd[1]: Started libpod-conmon-b7033b14cdc36fc1e4be49662b062475ff900116966061e63d4a87ea2425e615.scope.
Oct  7 16:21:30 np0005474864 podman[228673]: 2025-10-07 20:21:30.509588452 +0000 UTC m=+0.030722965 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:21:30 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:21:30 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/fefce3f9f111b0017c8ae02314bb70df2ec1849ad227a306c9a04e815b04ad2a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:21:30 np0005474864 podman[228673]: 2025-10-07 20:21:30.626850209 +0000 UTC m=+0.147984712 container init b7033b14cdc36fc1e4be49662b062475ff900116966061e63d4a87ea2425e615 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  7 16:21:30 np0005474864 podman[228673]: 2025-10-07 20:21:30.631957876 +0000 UTC m=+0.153092359 container start b7033b14cdc36fc1e4be49662b062475ff900116966061e63d4a87ea2425e615 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  7 16:21:30 np0005474864 neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47[228689]: [NOTICE]   (228693) : New worker (228695) forked
Oct  7 16:21:30 np0005474864 neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47[228689]: [NOTICE]   (228693) : Loading success.
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.864 2 DEBUG nova.compute.manager [req-b31c1a71-eff9-4d6a-9125-a559f23c46f7 req-eefc4f86-1e88-4392-aedc-145d285701d8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Received event network-vif-plugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.865 2 DEBUG oslo_concurrency.lockutils [req-b31c1a71-eff9-4d6a-9125-a559f23c46f7 req-eefc4f86-1e88-4392-aedc-145d285701d8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.865 2 DEBUG oslo_concurrency.lockutils [req-b31c1a71-eff9-4d6a-9125-a559f23c46f7 req-eefc4f86-1e88-4392-aedc-145d285701d8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.866 2 DEBUG oslo_concurrency.lockutils [req-b31c1a71-eff9-4d6a-9125-a559f23c46f7 req-eefc4f86-1e88-4392-aedc-145d285701d8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.867 2 DEBUG nova.compute.manager [req-b31c1a71-eff9-4d6a-9125-a559f23c46f7 req-eefc4f86-1e88-4392-aedc-145d285701d8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Processing event network-vif-plugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.867 2 DEBUG nova.compute.manager [req-b31c1a71-eff9-4d6a-9125-a559f23c46f7 req-eefc4f86-1e88-4392-aedc-145d285701d8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Received event network-vif-plugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.867 2 DEBUG oslo_concurrency.lockutils [req-b31c1a71-eff9-4d6a-9125-a559f23c46f7 req-eefc4f86-1e88-4392-aedc-145d285701d8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.868 2 DEBUG oslo_concurrency.lockutils [req-b31c1a71-eff9-4d6a-9125-a559f23c46f7 req-eefc4f86-1e88-4392-aedc-145d285701d8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.868 2 DEBUG oslo_concurrency.lockutils [req-b31c1a71-eff9-4d6a-9125-a559f23c46f7 req-eefc4f86-1e88-4392-aedc-145d285701d8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.869 2 DEBUG nova.compute.manager [req-b31c1a71-eff9-4d6a-9125-a559f23c46f7 req-eefc4f86-1e88-4392-aedc-145d285701d8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] No waiting events found dispatching network-vif-plugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.869 2 WARNING nova.compute.manager [req-b31c1a71-eff9-4d6a-9125-a559f23c46f7 req-eefc4f86-1e88-4392-aedc-145d285701d8 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Received unexpected event network-vif-plugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf for instance with vm_state building and task_state spawning.#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.932 2 DEBUG nova.compute.manager [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.933 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868490.931713, 36338d64-f0f0-468c-be11-8a124d76cb6e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.934 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] VM Started (Lifecycle Event)#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.938 2 DEBUG nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.942 2 INFO nova.virt.libvirt.driver [-] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Instance spawned successfully.#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.943 2 DEBUG nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.979 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.989 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.994 2 DEBUG nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.994 2 DEBUG nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.995 2 DEBUG nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.996 2 DEBUG nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.997 2 DEBUG nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:21:30 np0005474864 nova_compute[192593]: 2025-10-07 20:21:30.998 2 DEBUG nova.virt.libvirt.driver [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:21:31 np0005474864 nova_compute[192593]: 2025-10-07 20:21:31.030 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:21:31 np0005474864 nova_compute[192593]: 2025-10-07 20:21:31.031 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868490.9329178, 36338d64-f0f0-468c-be11-8a124d76cb6e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:21:31 np0005474864 nova_compute[192593]: 2025-10-07 20:21:31.031 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:21:31 np0005474864 nova_compute[192593]: 2025-10-07 20:21:31.067 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:21:31 np0005474864 nova_compute[192593]: 2025-10-07 20:21:31.071 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868490.9374897, 36338d64-f0f0-468c-be11-8a124d76cb6e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:21:31 np0005474864 nova_compute[192593]: 2025-10-07 20:21:31.071 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:21:31 np0005474864 nova_compute[192593]: 2025-10-07 20:21:31.082 2 INFO nova.compute.manager [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Took 5.16 seconds to spawn the instance on the hypervisor.#033[00m
Oct  7 16:21:31 np0005474864 nova_compute[192593]: 2025-10-07 20:21:31.083 2 DEBUG nova.compute.manager [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:21:31 np0005474864 nova_compute[192593]: 2025-10-07 20:21:31.091 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:21:31 np0005474864 nova_compute[192593]: 2025-10-07 20:21:31.094 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:21:31 np0005474864 nova_compute[192593]: 2025-10-07 20:21:31.098 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:21:31 np0005474864 nova_compute[192593]: 2025-10-07 20:21:31.134 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:21:31 np0005474864 nova_compute[192593]: 2025-10-07 20:21:31.171 2 INFO nova.compute.manager [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Took 5.90 seconds to build instance.#033[00m
Oct  7 16:21:31 np0005474864 nova_compute[192593]: 2025-10-07 20:21:31.187 2 DEBUG oslo_concurrency.lockutils [None req-c78ba9f3-904d-4f7a-8ff5-d5687cdcf5e2 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:31 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:31Z|00247|binding|INFO|Releasing lport d5f080d2-c8a7-4a60-b03d-eaf0f085bebb from this chassis (sb_readonly=0)
Oct  7 16:21:31 np0005474864 nova_compute[192593]: 2025-10-07 20:21:31.494 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:31 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:31Z|00248|binding|INFO|Releasing lport d5f080d2-c8a7-4a60-b03d-eaf0f085bebb from this chassis (sb_readonly=0)
Oct  7 16:21:31 np0005474864 nova_compute[192593]: 2025-10-07 20:21:31.641 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:32 np0005474864 nova_compute[192593]: 2025-10-07 20:21:32.094 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:21:32 np0005474864 nova_compute[192593]: 2025-10-07 20:21:32.095 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:21:32 np0005474864 nova_compute[192593]: 2025-10-07 20:21:32.095 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:21:32 np0005474864 nova_compute[192593]: 2025-10-07 20:21:32.427 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "refresh_cache-36338d64-f0f0-468c-be11-8a124d76cb6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:21:32 np0005474864 nova_compute[192593]: 2025-10-07 20:21:32.427 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquired lock "refresh_cache-36338d64-f0f0-468c-be11-8a124d76cb6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:21:32 np0005474864 nova_compute[192593]: 2025-10-07 20:21:32.427 2 DEBUG nova.network.neutron [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  7 16:21:32 np0005474864 nova_compute[192593]: 2025-10-07 20:21:32.427 2 DEBUG nova.objects.instance [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 36338d64-f0f0-468c-be11-8a124d76cb6e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:21:33 np0005474864 nova_compute[192593]: 2025-10-07 20:21:33.797 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:34 np0005474864 nova_compute[192593]: 2025-10-07 20:21:34.129 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:34 np0005474864 nova_compute[192593]: 2025-10-07 20:21:34.141 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759868479.1403437, d50d0791-c234-4390-a519-3ce1c8561824 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:21:34 np0005474864 nova_compute[192593]: 2025-10-07 20:21:34.142 2 INFO nova.compute.manager [-] [instance: d50d0791-c234-4390-a519-3ce1c8561824] VM Stopped (Lifecycle Event)#033[00m
Oct  7 16:21:34 np0005474864 nova_compute[192593]: 2025-10-07 20:21:34.167 2 DEBUG nova.compute.manager [None req-1908636a-729b-4752-a153-fa30e958fdf5 - - - - - -] [instance: d50d0791-c234-4390-a519-3ce1c8561824] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:21:34 np0005474864 nova_compute[192593]: 2025-10-07 20:21:34.417 2 DEBUG nova.network.neutron [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Updating instance_info_cache with network_info: [{"id": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "address": "fa:16:3e:40:18:be", "network": {"id": "7a72b09b-88c3-4d24-8cee-ec65d8210d47", "bridge": "br-int", "label": "tempest-network-smoke--766702828", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ce1d9a-86", "ovs_interfaceid": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:21:34 np0005474864 nova_compute[192593]: 2025-10-07 20:21:34.451 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Releasing lock "refresh_cache-36338d64-f0f0-468c-be11-8a124d76cb6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:21:34 np0005474864 nova_compute[192593]: 2025-10-07 20:21:34.452 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  7 16:21:34 np0005474864 nova_compute[192593]: 2025-10-07 20:21:34.453 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:21:34 np0005474864 nova_compute[192593]: 2025-10-07 20:21:34.454 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:21:34 np0005474864 NetworkManager[51631]: <info>  [1759868494.9974] manager: (patch-br-int-to-provnet-53a6f422-cce6-4082-98aa-3619989346bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/133)
Oct  7 16:21:34 np0005474864 nova_compute[192593]: 2025-10-07 20:21:34.997 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:34 np0005474864 NetworkManager[51631]: <info>  [1759868494.9993] manager: (patch-provnet-53a6f422-cce6-4082-98aa-3619989346bc-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Oct  7 16:21:35 np0005474864 nova_compute[192593]: 2025-10-07 20:21:35.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:21:35 np0005474864 nova_compute[192593]: 2025-10-07 20:21:35.105 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:35 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:35Z|00249|binding|INFO|Releasing lport d5f080d2-c8a7-4a60-b03d-eaf0f085bebb from this chassis (sb_readonly=0)
Oct  7 16:21:35 np0005474864 nova_compute[192593]: 2025-10-07 20:21:35.116 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:35 np0005474864 nova_compute[192593]: 2025-10-07 20:21:35.902 2 DEBUG nova.compute.manager [req-09699225-de34-4aef-b0f7-853c4f2735ee req-8bc9f001-a7e4-449d-8cc3-178d02f860fc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Received event network-changed-82ce1d9a-86a1-4412-a852-b1ee4b3139bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:21:35 np0005474864 nova_compute[192593]: 2025-10-07 20:21:35.904 2 DEBUG nova.compute.manager [req-09699225-de34-4aef-b0f7-853c4f2735ee req-8bc9f001-a7e4-449d-8cc3-178d02f860fc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Refreshing instance network info cache due to event network-changed-82ce1d9a-86a1-4412-a852-b1ee4b3139bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:21:35 np0005474864 nova_compute[192593]: 2025-10-07 20:21:35.904 2 DEBUG oslo_concurrency.lockutils [req-09699225-de34-4aef-b0f7-853c4f2735ee req-8bc9f001-a7e4-449d-8cc3-178d02f860fc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-36338d64-f0f0-468c-be11-8a124d76cb6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:21:35 np0005474864 nova_compute[192593]: 2025-10-07 20:21:35.905 2 DEBUG oslo_concurrency.lockutils [req-09699225-de34-4aef-b0f7-853c4f2735ee req-8bc9f001-a7e4-449d-8cc3-178d02f860fc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-36338d64-f0f0-468c-be11-8a124d76cb6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:21:35 np0005474864 nova_compute[192593]: 2025-10-07 20:21:35.905 2 DEBUG nova.network.neutron [req-09699225-de34-4aef-b0f7-853c4f2735ee req-8bc9f001-a7e4-449d-8cc3-178d02f860fc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Refreshing network info cache for port 82ce1d9a-86a1-4412-a852-b1ee4b3139bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:21:37 np0005474864 nova_compute[192593]: 2025-10-07 20:21:37.574 2 DEBUG nova.network.neutron [req-09699225-de34-4aef-b0f7-853c4f2735ee req-8bc9f001-a7e4-449d-8cc3-178d02f860fc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Updated VIF entry in instance network info cache for port 82ce1d9a-86a1-4412-a852-b1ee4b3139bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:21:37 np0005474864 nova_compute[192593]: 2025-10-07 20:21:37.574 2 DEBUG nova.network.neutron [req-09699225-de34-4aef-b0f7-853c4f2735ee req-8bc9f001-a7e4-449d-8cc3-178d02f860fc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Updating instance_info_cache with network_info: [{"id": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "address": "fa:16:3e:40:18:be", "network": {"id": "7a72b09b-88c3-4d24-8cee-ec65d8210d47", "bridge": "br-int", "label": "tempest-network-smoke--766702828", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ce1d9a-86", "ovs_interfaceid": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:21:37 np0005474864 nova_compute[192593]: 2025-10-07 20:21:37.608 2 DEBUG oslo_concurrency.lockutils [req-09699225-de34-4aef-b0f7-853c4f2735ee req-8bc9f001-a7e4-449d-8cc3-178d02f860fc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-36338d64-f0f0-468c-be11-8a124d76cb6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:21:38 np0005474864 nova_compute[192593]: 2025-10-07 20:21:38.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:39 np0005474864 nova_compute[192593]: 2025-10-07 20:21:39.133 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:42 np0005474864 podman[228725]: 2025-10-07 20:21:42.419460634 +0000 UTC m=+0.098456456 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:21:42 np0005474864 podman[228726]: 2025-10-07 20:21:42.42836675 +0000 UTC m=+0.104803689 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, architecture=x86_64, config_id=edpm, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Oct  7 16:21:42 np0005474864 nova_compute[192593]: 2025-10-07 20:21:42.768 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:43 np0005474864 nova_compute[192593]: 2025-10-07 20:21:43.807 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:44 np0005474864 nova_compute[192593]: 2025-10-07 20:21:44.135 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:44 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:44Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:40:18:be 10.100.0.10
Oct  7 16:21:44 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:44Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:40:18:be 10.100.0.10
Oct  7 16:21:45 np0005474864 nova_compute[192593]: 2025-10-07 20:21:45.784 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:46 np0005474864 podman[228768]: 2025-10-07 20:21:46.420989254 +0000 UTC m=+0.103925223 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:21:46 np0005474864 podman[228770]: 2025-10-07 20:21:46.429473099 +0000 UTC m=+0.095907263 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, container_name=multipathd)
Oct  7 16:21:46 np0005474864 podman[228769]: 2025-10-07 20:21:46.442556775 +0000 UTC m=+0.129676784 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  7 16:21:47 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:47.231 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:f3:35 10.100.0.2 2001:db8::f816:3eff:fec1:f335'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fec1:f335/64', 'neutron:device_id': 'ovnmeta-3d8c8c07-c024-4df2-a666-f1bfe0c33596', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d8c8c07-c024-4df2-a666-f1bfe0c33596', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e359333-df38-4aab-aafe-7e374b5b3f6c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bbf6238c-d2ee-49d7-8d9d-bee291ff7c80) old=Port_Binding(mac=['fa:16:3e:c1:f3:35 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-3d8c8c07-c024-4df2-a666-f1bfe0c33596', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d8c8c07-c024-4df2-a666-f1bfe0c33596', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:21:47 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:47.234 103685 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port bbf6238c-d2ee-49d7-8d9d-bee291ff7c80 in datapath 3d8c8c07-c024-4df2-a666-f1bfe0c33596 updated#033[00m
Oct  7 16:21:47 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:47.237 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3d8c8c07-c024-4df2-a666-f1bfe0c33596, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:21:47 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:47.238 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[402b1db6-5be2-4b6e-9c15-281440e63c90]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:48 np0005474864 nova_compute[192593]: 2025-10-07 20:21:48.811 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:49 np0005474864 nova_compute[192593]: 2025-10-07 20:21:49.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:51 np0005474864 nova_compute[192593]: 2025-10-07 20:21:51.173 2 INFO nova.compute.manager [None req-6fa9689e-0823-4e85-a3a7-5e57d792795f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Get console output#033[00m
Oct  7 16:21:51 np0005474864 nova_compute[192593]: 2025-10-07 20:21:51.180 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  7 16:21:51 np0005474864 nova_compute[192593]: 2025-10-07 20:21:51.491 2 DEBUG nova.objects.instance [None req-f5b93f72-fdbb-47fe-b3fc-47d78e85f30a db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lazy-loading 'pci_devices' on Instance uuid 36338d64-f0f0-468c-be11-8a124d76cb6e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:21:51 np0005474864 nova_compute[192593]: 2025-10-07 20:21:51.523 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868511.5228972, 36338d64-f0f0-468c-be11-8a124d76cb6e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:21:51 np0005474864 nova_compute[192593]: 2025-10-07 20:21:51.525 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:21:51 np0005474864 nova_compute[192593]: 2025-10-07 20:21:51.546 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:21:51 np0005474864 nova_compute[192593]: 2025-10-07 20:21:51.553 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:21:51 np0005474864 nova_compute[192593]: 2025-10-07 20:21:51.577 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Oct  7 16:21:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:51.675 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c1:f3:35 10.100.0.2 2001:db8:0:1:f816:3eff:fec1:f335 2001:db8::f816:3eff:fec1:f335'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fec1:f335/64 2001:db8::f816:3eff:fec1:f335/64', 'neutron:device_id': 'ovnmeta-3d8c8c07-c024-4df2-a666-f1bfe0c33596', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d8c8c07-c024-4df2-a666-f1bfe0c33596', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3e359333-df38-4aab-aafe-7e374b5b3f6c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=bbf6238c-d2ee-49d7-8d9d-bee291ff7c80) old=Port_Binding(mac=['fa:16:3e:c1:f3:35 10.100.0.2 2001:db8::f816:3eff:fec1:f335'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fec1:f335/64', 'neutron:device_id': 'ovnmeta-3d8c8c07-c024-4df2-a666-f1bfe0c33596', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3d8c8c07-c024-4df2-a666-f1bfe0c33596', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:21:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:51.678 103685 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port bbf6238c-d2ee-49d7-8d9d-bee291ff7c80 in datapath 3d8c8c07-c024-4df2-a666-f1bfe0c33596 updated#033[00m
Oct  7 16:21:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:51.680 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3d8c8c07-c024-4df2-a666-f1bfe0c33596, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:21:51 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:51.681 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[43f14384-436b-420a-acb9-19c6145345e7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:52 np0005474864 kernel: tap82ce1d9a-86 (unregistering): left promiscuous mode
Oct  7 16:21:52 np0005474864 NetworkManager[51631]: <info>  [1759868512.3783] device (tap82ce1d9a-86): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:21:52 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:52Z|00250|binding|INFO|Releasing lport 82ce1d9a-86a1-4412-a852-b1ee4b3139bf from this chassis (sb_readonly=0)
Oct  7 16:21:52 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:52Z|00251|binding|INFO|Setting lport 82ce1d9a-86a1-4412-a852-b1ee4b3139bf down in Southbound
Oct  7 16:21:52 np0005474864 nova_compute[192593]: 2025-10-07 20:21:52.398 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:52 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:52Z|00252|binding|INFO|Removing iface tap82ce1d9a-86 ovn-installed in OVS
Oct  7 16:21:52 np0005474864 nova_compute[192593]: 2025-10-07 20:21:52.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:52.408 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:18:be 10.100.0.10'], port_security=['fa:16:3e:40:18:be 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '36338d64-f0f0-468c-be11-8a124d76cb6e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a72b09b-88c3-4d24-8cee-ec65d8210d47', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'bc1c8622-dacf-40f5-b88e-b142e1321b0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51fe249a-d1c8-46e8-8b8e-10a8ba338db8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=82ce1d9a-86a1-4412-a852-b1ee4b3139bf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:21:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:52.410 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 82ce1d9a-86a1-4412-a852-b1ee4b3139bf in datapath 7a72b09b-88c3-4d24-8cee-ec65d8210d47 unbound from our chassis#033[00m
Oct  7 16:21:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:52.412 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7a72b09b-88c3-4d24-8cee-ec65d8210d47, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:21:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:52.413 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[458f75be-84c7-46ed-af17-f94096b5749e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:52.414 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47 namespace which is not needed anymore#033[00m
Oct  7 16:21:52 np0005474864 nova_compute[192593]: 2025-10-07 20:21:52.418 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:52 np0005474864 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000002d.scope: Deactivated successfully.
Oct  7 16:21:52 np0005474864 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d0000002d.scope: Consumed 13.885s CPU time.
Oct  7 16:21:52 np0005474864 systemd-machined[152586]: Machine qemu-16-instance-0000002d terminated.
Oct  7 16:21:52 np0005474864 podman[228837]: 2025-10-07 20:21:52.528171343 +0000 UTC m=+0.095224263 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  7 16:21:52 np0005474864 neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47[228689]: [NOTICE]   (228693) : haproxy version is 2.8.14-c23fe91
Oct  7 16:21:52 np0005474864 neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47[228689]: [NOTICE]   (228693) : path to executable is /usr/sbin/haproxy
Oct  7 16:21:52 np0005474864 neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47[228689]: [WARNING]  (228693) : Exiting Master process...
Oct  7 16:21:52 np0005474864 neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47[228689]: [ALERT]    (228693) : Current worker (228695) exited with code 143 (Terminated)
Oct  7 16:21:52 np0005474864 neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47[228689]: [WARNING]  (228693) : All workers exited. Exiting... (0)
Oct  7 16:21:52 np0005474864 systemd[1]: libpod-b7033b14cdc36fc1e4be49662b062475ff900116966061e63d4a87ea2425e615.scope: Deactivated successfully.
Oct  7 16:21:52 np0005474864 podman[228877]: 2025-10-07 20:21:52.602436751 +0000 UTC m=+0.065026463 container died b7033b14cdc36fc1e4be49662b062475ff900116966061e63d4a87ea2425e615 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001)
Oct  7 16:21:52 np0005474864 nova_compute[192593]: 2025-10-07 20:21:52.626 2 DEBUG nova.compute.manager [None req-f5b93f72-fdbb-47fe-b3fc-47d78e85f30a db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:21:52 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b7033b14cdc36fc1e4be49662b062475ff900116966061e63d4a87ea2425e615-userdata-shm.mount: Deactivated successfully.
Oct  7 16:21:52 np0005474864 systemd[1]: var-lib-containers-storage-overlay-fefce3f9f111b0017c8ae02314bb70df2ec1849ad227a306c9a04e815b04ad2a-merged.mount: Deactivated successfully.
Oct  7 16:21:52 np0005474864 podman[228877]: 2025-10-07 20:21:52.651337679 +0000 UTC m=+0.113927301 container cleanup b7033b14cdc36fc1e4be49662b062475ff900116966061e63d4a87ea2425e615 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:21:52 np0005474864 systemd[1]: libpod-conmon-b7033b14cdc36fc1e4be49662b062475ff900116966061e63d4a87ea2425e615.scope: Deactivated successfully.
Oct  7 16:21:52 np0005474864 podman[228921]: 2025-10-07 20:21:52.749729302 +0000 UTC m=+0.066981500 container remove b7033b14cdc36fc1e4be49662b062475ff900116966061e63d4a87ea2425e615 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  7 16:21:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:52.759 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[11f81aaa-d468-41e9-84c0-60740bf97287]: (4, ('Tue Oct  7 08:21:52 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47 (b7033b14cdc36fc1e4be49662b062475ff900116966061e63d4a87ea2425e615)\nb7033b14cdc36fc1e4be49662b062475ff900116966061e63d4a87ea2425e615\nTue Oct  7 08:21:52 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47 (b7033b14cdc36fc1e4be49662b062475ff900116966061e63d4a87ea2425e615)\nb7033b14cdc36fc1e4be49662b062475ff900116966061e63d4a87ea2425e615\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:52.762 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[f20ff992-8ee2-4071-a264-9069d4882ca3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:52.763 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a72b09b-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:21:52 np0005474864 nova_compute[192593]: 2025-10-07 20:21:52.766 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:52 np0005474864 kernel: tap7a72b09b-80: left promiscuous mode
Oct  7 16:21:52 np0005474864 nova_compute[192593]: 2025-10-07 20:21:52.798 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:52.802 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[8db00724-08da-436f-b1c1-87848e28c50a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:52.829 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[e228ba20-8972-4a16-8162-795907183afa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:52.831 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[52ababbe-4f46-4e2c-b456-26f22d406d87]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:52.858 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d9a29d79-eb86-46fb-b001-92f982717e74]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 412562, 'reachable_time': 33705, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 228938, 'error': None, 'target': 'ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:52 np0005474864 systemd[1]: run-netns-ovnmeta\x2d7a72b09b\x2d88c3\x2d4d24\x2d8cee\x2dec65d8210d47.mount: Deactivated successfully.
Oct  7 16:21:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:52.861 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:21:52 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:52.861 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[578e0f6c-6b6c-46c1-9974-2d995e9a726f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:53 np0005474864 nova_compute[192593]: 2025-10-07 20:21:53.814 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:54 np0005474864 nova_compute[192593]: 2025-10-07 20:21:54.139 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:54 np0005474864 nova_compute[192593]: 2025-10-07 20:21:54.760 2 INFO nova.compute.manager [None req-e8e12af6-c395-439d-97ce-1f17aeaf7b88 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Get console output#033[00m
Oct  7 16:21:55 np0005474864 nova_compute[192593]: 2025-10-07 20:21:55.100 2 INFO nova.compute.manager [None req-d1bdc967-a353-4ae3-8391-db410466edc5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Resuming#033[00m
Oct  7 16:21:55 np0005474864 nova_compute[192593]: 2025-10-07 20:21:55.101 2 DEBUG nova.objects.instance [None req-d1bdc967-a353-4ae3-8391-db410466edc5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lazy-loading 'flavor' on Instance uuid 36338d64-f0f0-468c-be11-8a124d76cb6e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:21:55 np0005474864 nova_compute[192593]: 2025-10-07 20:21:55.157 2 DEBUG oslo_concurrency.lockutils [None req-d1bdc967-a353-4ae3-8391-db410466edc5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "refresh_cache-36338d64-f0f0-468c-be11-8a124d76cb6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:21:55 np0005474864 nova_compute[192593]: 2025-10-07 20:21:55.157 2 DEBUG oslo_concurrency.lockutils [None req-d1bdc967-a353-4ae3-8391-db410466edc5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquired lock "refresh_cache-36338d64-f0f0-468c-be11-8a124d76cb6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:21:55 np0005474864 nova_compute[192593]: 2025-10-07 20:21:55.157 2 DEBUG nova.network.neutron [None req-d1bdc967-a353-4ae3-8391-db410466edc5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:21:55 np0005474864 nova_compute[192593]: 2025-10-07 20:21:55.384 2 DEBUG nova.compute.manager [req-5a000b38-ebdc-491d-b38f-73e33a78924f req-0b267902-2aa2-4439-ab38-ead9700fd0f4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Received event network-vif-unplugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:21:55 np0005474864 nova_compute[192593]: 2025-10-07 20:21:55.384 2 DEBUG oslo_concurrency.lockutils [req-5a000b38-ebdc-491d-b38f-73e33a78924f req-0b267902-2aa2-4439-ab38-ead9700fd0f4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:55 np0005474864 nova_compute[192593]: 2025-10-07 20:21:55.385 2 DEBUG oslo_concurrency.lockutils [req-5a000b38-ebdc-491d-b38f-73e33a78924f req-0b267902-2aa2-4439-ab38-ead9700fd0f4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:55 np0005474864 nova_compute[192593]: 2025-10-07 20:21:55.385 2 DEBUG oslo_concurrency.lockutils [req-5a000b38-ebdc-491d-b38f-73e33a78924f req-0b267902-2aa2-4439-ab38-ead9700fd0f4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:55 np0005474864 nova_compute[192593]: 2025-10-07 20:21:55.386 2 DEBUG nova.compute.manager [req-5a000b38-ebdc-491d-b38f-73e33a78924f req-0b267902-2aa2-4439-ab38-ead9700fd0f4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] No waiting events found dispatching network-vif-unplugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:21:55 np0005474864 nova_compute[192593]: 2025-10-07 20:21:55.386 2 WARNING nova.compute.manager [req-5a000b38-ebdc-491d-b38f-73e33a78924f req-0b267902-2aa2-4439-ab38-ead9700fd0f4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Received unexpected event network-vif-unplugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf for instance with vm_state suspended and task_state resuming.#033[00m
Oct  7 16:21:56 np0005474864 podman[228939]: 2025-10-07 20:21:56.392829851 +0000 UTC m=+0.078909003 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:21:56 np0005474864 nova_compute[192593]: 2025-10-07 20:21:56.541 2 DEBUG nova.network.neutron [None req-d1bdc967-a353-4ae3-8391-db410466edc5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Updating instance_info_cache with network_info: [{"id": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "address": "fa:16:3e:40:18:be", "network": {"id": "7a72b09b-88c3-4d24-8cee-ec65d8210d47", "bridge": "br-int", "label": "tempest-network-smoke--766702828", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ce1d9a-86", "ovs_interfaceid": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:21:56 np0005474864 nova_compute[192593]: 2025-10-07 20:21:56.561 2 DEBUG oslo_concurrency.lockutils [None req-d1bdc967-a353-4ae3-8391-db410466edc5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Releasing lock "refresh_cache-36338d64-f0f0-468c-be11-8a124d76cb6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:21:56 np0005474864 nova_compute[192593]: 2025-10-07 20:21:56.569 2 DEBUG nova.virt.libvirt.vif [None req-d1bdc967-a353-4ae3-8391-db410466edc5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:21:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-883660464',display_name='tempest-TestNetworkAdvancedServerOps-server-883660464',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-883660464',id=45,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOyB3CcNORNcsfVL/w9sXfU3FdzkZ4AiGT/StOp9LHIsdKxlw+iLDlhVW1MIaui1RMyyiBSqaPsgoKcY2aLSu0V33V1Wp3zJIlOmQfh9LFe+fpXTG0rgU+nykyRls0Ao5w==',key_name='tempest-TestNetworkAdvancedServerOps-516914566',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:21:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='8a545a398e2e433bbe3f3dfa2ec4ebcb',ramdisk_id='',reservation_id='r-s2m74s4z',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-585003851',owner_user_name='tempest-TestNetworkAdvancedServerOps-585003851-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:21:52Z,user_data=None,user_id='db22b0e0f6594362af24484ba9b01936',uuid=36338d64-f0f0-468c-be11-8a124d76cb6e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "address": "fa:16:3e:40:18:be", "network": {"id": "7a72b09b-88c3-4d24-8cee-ec65d8210d47", "bridge": "br-int", "label": "tempest-network-smoke--766702828", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ce1d9a-86", "ovs_interfaceid": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:21:56 np0005474864 nova_compute[192593]: 2025-10-07 20:21:56.570 2 DEBUG nova.network.os_vif_util [None req-d1bdc967-a353-4ae3-8391-db410466edc5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converting VIF {"id": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "address": "fa:16:3e:40:18:be", "network": {"id": "7a72b09b-88c3-4d24-8cee-ec65d8210d47", "bridge": "br-int", "label": "tempest-network-smoke--766702828", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ce1d9a-86", "ovs_interfaceid": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:21:56 np0005474864 nova_compute[192593]: 2025-10-07 20:21:56.571 2 DEBUG nova.network.os_vif_util [None req-d1bdc967-a353-4ae3-8391-db410466edc5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:18:be,bridge_name='br-int',has_traffic_filtering=True,id=82ce1d9a-86a1-4412-a852-b1ee4b3139bf,network=Network(7a72b09b-88c3-4d24-8cee-ec65d8210d47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ce1d9a-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:21:56 np0005474864 nova_compute[192593]: 2025-10-07 20:21:56.572 2 DEBUG os_vif [None req-d1bdc967-a353-4ae3-8391-db410466edc5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:18:be,bridge_name='br-int',has_traffic_filtering=True,id=82ce1d9a-86a1-4412-a852-b1ee4b3139bf,network=Network(7a72b09b-88c3-4d24-8cee-ec65d8210d47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ce1d9a-86') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:21:56 np0005474864 nova_compute[192593]: 2025-10-07 20:21:56.573 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:56 np0005474864 nova_compute[192593]: 2025-10-07 20:21:56.573 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:21:56 np0005474864 nova_compute[192593]: 2025-10-07 20:21:56.574 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:21:56 np0005474864 nova_compute[192593]: 2025-10-07 20:21:56.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:56 np0005474864 nova_compute[192593]: 2025-10-07 20:21:56.581 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap82ce1d9a-86, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:21:56 np0005474864 nova_compute[192593]: 2025-10-07 20:21:56.581 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap82ce1d9a-86, col_values=(('external_ids', {'iface-id': '82ce1d9a-86a1-4412-a852-b1ee4b3139bf', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:40:18:be', 'vm-uuid': '36338d64-f0f0-468c-be11-8a124d76cb6e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:21:56 np0005474864 nova_compute[192593]: 2025-10-07 20:21:56.582 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:21:56 np0005474864 nova_compute[192593]: 2025-10-07 20:21:56.583 2 INFO os_vif [None req-d1bdc967-a353-4ae3-8391-db410466edc5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:18:be,bridge_name='br-int',has_traffic_filtering=True,id=82ce1d9a-86a1-4412-a852-b1ee4b3139bf,network=Network(7a72b09b-88c3-4d24-8cee-ec65d8210d47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ce1d9a-86')#033[00m
Oct  7 16:21:56 np0005474864 nova_compute[192593]: 2025-10-07 20:21:56.609 2 DEBUG nova.objects.instance [None req-d1bdc967-a353-4ae3-8391-db410466edc5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lazy-loading 'numa_topology' on Instance uuid 36338d64-f0f0-468c-be11-8a124d76cb6e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:21:56 np0005474864 kernel: tap82ce1d9a-86: entered promiscuous mode
Oct  7 16:21:56 np0005474864 NetworkManager[51631]: <info>  [1759868516.7120] manager: (tap82ce1d9a-86): new Tun device (/org/freedesktop/NetworkManager/Devices/135)
Oct  7 16:21:56 np0005474864 nova_compute[192593]: 2025-10-07 20:21:56.716 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:56 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:56Z|00253|binding|INFO|Claiming lport 82ce1d9a-86a1-4412-a852-b1ee4b3139bf for this chassis.
Oct  7 16:21:56 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:56Z|00254|binding|INFO|82ce1d9a-86a1-4412-a852-b1ee4b3139bf: Claiming fa:16:3e:40:18:be 10.100.0.10
Oct  7 16:21:56 np0005474864 nova_compute[192593]: 2025-10-07 20:21:56.720 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:56.727 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:18:be 10.100.0.10'], port_security=['fa:16:3e:40:18:be 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '36338d64-f0f0-468c-be11-8a124d76cb6e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a72b09b-88c3-4d24-8cee-ec65d8210d47', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'bc1c8622-dacf-40f5-b88e-b142e1321b0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.175'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51fe249a-d1c8-46e8-8b8e-10a8ba338db8, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=82ce1d9a-86a1-4412-a852-b1ee4b3139bf) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:21:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:56.729 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 82ce1d9a-86a1-4412-a852-b1ee4b3139bf in datapath 7a72b09b-88c3-4d24-8cee-ec65d8210d47 bound to our chassis#033[00m
Oct  7 16:21:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:56.732 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7a72b09b-88c3-4d24-8cee-ec65d8210d47#033[00m
Oct  7 16:21:56 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:56Z|00255|binding|INFO|Setting lport 82ce1d9a-86a1-4412-a852-b1ee4b3139bf ovn-installed in OVS
Oct  7 16:21:56 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:56Z|00256|binding|INFO|Setting lport 82ce1d9a-86a1-4412-a852-b1ee4b3139bf up in Southbound
Oct  7 16:21:56 np0005474864 nova_compute[192593]: 2025-10-07 20:21:56.740 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:56 np0005474864 nova_compute[192593]: 2025-10-07 20:21:56.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:56.750 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[31be2652-c4a5-4652-82a1-168f43d373f7]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:56.752 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7a72b09b-81 in ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:21:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:56.754 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7a72b09b-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:21:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:56.755 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[87d00b9b-ea31-462a-ac0d-f692f3778c2d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:56.756 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[4ec0e847-d6ec-4d59-aa8e-4bc954ab2497]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:56 np0005474864 systemd-udevd[228981]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:21:56 np0005474864 systemd-machined[152586]: New machine qemu-17-instance-0000002d.
Oct  7 16:21:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:56.777 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[ca44842e-46a0-434a-a81e-468a9925e54c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:56 np0005474864 NetworkManager[51631]: <info>  [1759868516.7832] device (tap82ce1d9a-86): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:21:56 np0005474864 NetworkManager[51631]: <info>  [1759868516.7839] device (tap82ce1d9a-86): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:21:56 np0005474864 systemd[1]: Started Virtual Machine qemu-17-instance-0000002d.
Oct  7 16:21:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:56.811 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[8c21f196-145a-4460-8226-22ef29b10c8c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:56.858 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[c6f44bb9-557b-42a2-816d-a90eddd8cb5c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:56.866 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[f9301d01-c5ff-4c8d-9bfa-cea65a82face]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:56 np0005474864 NetworkManager[51631]: <info>  [1759868516.8672] manager: (tap7a72b09b-80): new Veth device (/org/freedesktop/NetworkManager/Devices/136)
Oct  7 16:21:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:56.909 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[2a4ed1ef-796e-45b2-9fb7-5107a30d2c93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:56.913 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[87fa0594-3c93-4571-8c3a-176ea942a7da]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:56 np0005474864 NetworkManager[51631]: <info>  [1759868516.9465] device (tap7a72b09b-80): carrier: link connected
Oct  7 16:21:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:56.953 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[04df73e1-9fb1-40c3-9755-19517c36b8e9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:56.971 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[047ade0c-3408-4a78-8941-275d0b536153]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7a72b09b-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:70:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415283, 'reachable_time': 35907, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229012, 'error': None, 'target': 'ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:56 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:56.992 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[2ea387e3-7469-4edb-9e5b-2bda0faac5fa]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe5c:7014'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 415283, 'tstamp': 415283}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229013, 'error': None, 'target': 'ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:57.013 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[27abc4e5-0672-4bf2-94e3-f773b4da3ff0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7a72b09b-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:5c:70:14'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 77], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415283, 'reachable_time': 35907, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229014, 'error': None, 'target': 'ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:57.061 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[2feb4985-c10b-4ef8-b706-947e48fee0d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:57.138 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[48d5f7ed-1e2b-44b5-b836-6738891e77d3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:57.140 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a72b09b-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:57.141 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:57.142 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7a72b09b-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:21:57 np0005474864 nova_compute[192593]: 2025-10-07 20:21:57.144 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:57 np0005474864 NetworkManager[51631]: <info>  [1759868517.1456] manager: (tap7a72b09b-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/137)
Oct  7 16:21:57 np0005474864 kernel: tap7a72b09b-80: entered promiscuous mode
Oct  7 16:21:57 np0005474864 nova_compute[192593]: 2025-10-07 20:21:57.147 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:57.149 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7a72b09b-80, col_values=(('external_ids', {'iface-id': 'd5f080d2-c8a7-4a60-b03d-eaf0f085bebb'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:21:57 np0005474864 nova_compute[192593]: 2025-10-07 20:21:57.150 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:57 np0005474864 ovn_controller[94801]: 2025-10-07T20:21:57Z|00257|binding|INFO|Releasing lport d5f080d2-c8a7-4a60-b03d-eaf0f085bebb from this chassis (sb_readonly=0)
Oct  7 16:21:57 np0005474864 nova_compute[192593]: 2025-10-07 20:21:57.176 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:57.178 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7a72b09b-88c3-4d24-8cee-ec65d8210d47.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7a72b09b-88c3-4d24-8cee-ec65d8210d47.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:57.179 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ce8bb4bb-31ba-4776-b0c0-7d77dd9aa789]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:57.180 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-7a72b09b-88c3-4d24-8cee-ec65d8210d47
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/7a72b09b-88c3-4d24-8cee-ec65d8210d47.pid.haproxy
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID 7a72b09b-88c3-4d24-8cee-ec65d8210d47
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:21:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:21:57.182 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47', 'env', 'PROCESS_TAG=haproxy-7a72b09b-88c3-4d24-8cee-ec65d8210d47', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7a72b09b-88c3-4d24-8cee-ec65d8210d47.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:21:57 np0005474864 nova_compute[192593]: 2025-10-07 20:21:57.530 2 DEBUG nova.compute.manager [req-eef543ec-a204-4a80-81e8-307658a06a2a req-0bf987b4-5d17-4c5e-8cac-e6c35c6d5e12 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Received event network-vif-plugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:21:57 np0005474864 nova_compute[192593]: 2025-10-07 20:21:57.531 2 DEBUG oslo_concurrency.lockutils [req-eef543ec-a204-4a80-81e8-307658a06a2a req-0bf987b4-5d17-4c5e-8cac-e6c35c6d5e12 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:57 np0005474864 nova_compute[192593]: 2025-10-07 20:21:57.531 2 DEBUG oslo_concurrency.lockutils [req-eef543ec-a204-4a80-81e8-307658a06a2a req-0bf987b4-5d17-4c5e-8cac-e6c35c6d5e12 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:57 np0005474864 nova_compute[192593]: 2025-10-07 20:21:57.532 2 DEBUG oslo_concurrency.lockutils [req-eef543ec-a204-4a80-81e8-307658a06a2a req-0bf987b4-5d17-4c5e-8cac-e6c35c6d5e12 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:57 np0005474864 nova_compute[192593]: 2025-10-07 20:21:57.532 2 DEBUG nova.compute.manager [req-eef543ec-a204-4a80-81e8-307658a06a2a req-0bf987b4-5d17-4c5e-8cac-e6c35c6d5e12 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] No waiting events found dispatching network-vif-plugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:21:57 np0005474864 nova_compute[192593]: 2025-10-07 20:21:57.533 2 WARNING nova.compute.manager [req-eef543ec-a204-4a80-81e8-307658a06a2a req-0bf987b4-5d17-4c5e-8cac-e6c35c6d5e12 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Received unexpected event network-vif-plugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf for instance with vm_state suspended and task_state resuming.#033[00m
Oct  7 16:21:57 np0005474864 podman[229053]: 2025-10-07 20:21:57.613363375 +0000 UTC m=+0.067368611 container create 4ae445c16654d7830b35977b46590f1b6305f8fb53a96541b813cbe7a96973f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  7 16:21:57 np0005474864 systemd[1]: Started libpod-conmon-4ae445c16654d7830b35977b46590f1b6305f8fb53a96541b813cbe7a96973f9.scope.
Oct  7 16:21:57 np0005474864 podman[229053]: 2025-10-07 20:21:57.58365252 +0000 UTC m=+0.037657786 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:21:57 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:21:57 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/67edd00aed64311ff294bb445707e332abc8ae728833e19fcecce1988ded5cda/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:21:57 np0005474864 podman[229053]: 2025-10-07 20:21:57.715764643 +0000 UTC m=+0.169769959 container init 4ae445c16654d7830b35977b46590f1b6305f8fb53a96541b813cbe7a96973f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  7 16:21:57 np0005474864 podman[229053]: 2025-10-07 20:21:57.723167416 +0000 UTC m=+0.177172672 container start 4ae445c16654d7830b35977b46590f1b6305f8fb53a96541b813cbe7a96973f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, io.buildah.version=1.41.3)
Oct  7 16:21:57 np0005474864 neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47[229069]: [NOTICE]   (229073) : New worker (229075) forked
Oct  7 16:21:57 np0005474864 neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47[229069]: [NOTICE]   (229073) : Loading success.
Oct  7 16:21:58 np0005474864 nova_compute[192593]: 2025-10-07 20:21:58.165 2 DEBUG nova.virt.libvirt.host [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Removed pending event for 36338d64-f0f0-468c-be11-8a124d76cb6e due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Oct  7 16:21:58 np0005474864 nova_compute[192593]: 2025-10-07 20:21:58.165 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868518.164811, 36338d64-f0f0-468c-be11-8a124d76cb6e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:21:58 np0005474864 nova_compute[192593]: 2025-10-07 20:21:58.166 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] VM Started (Lifecycle Event)#033[00m
Oct  7 16:21:58 np0005474864 nova_compute[192593]: 2025-10-07 20:21:58.200 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:21:58 np0005474864 nova_compute[192593]: 2025-10-07 20:21:58.214 2 DEBUG nova.compute.manager [None req-d1bdc967-a353-4ae3-8391-db410466edc5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  7 16:21:58 np0005474864 nova_compute[192593]: 2025-10-07 20:21:58.214 2 DEBUG nova.objects.instance [None req-d1bdc967-a353-4ae3-8391-db410466edc5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lazy-loading 'pci_devices' on Instance uuid 36338d64-f0f0-468c-be11-8a124d76cb6e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:21:58 np0005474864 nova_compute[192593]: 2025-10-07 20:21:58.217 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:21:58 np0005474864 nova_compute[192593]: 2025-10-07 20:21:58.236 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Oct  7 16:21:58 np0005474864 nova_compute[192593]: 2025-10-07 20:21:58.236 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868518.1724887, 36338d64-f0f0-468c-be11-8a124d76cb6e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:21:58 np0005474864 nova_compute[192593]: 2025-10-07 20:21:58.236 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:21:58 np0005474864 nova_compute[192593]: 2025-10-07 20:21:58.240 2 INFO nova.virt.libvirt.driver [-] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Instance running successfully.#033[00m
Oct  7 16:21:58 np0005474864 virtqemud[192092]: argument unsupported: QEMU guest agent is not configured
Oct  7 16:21:58 np0005474864 nova_compute[192593]: 2025-10-07 20:21:58.243 2 DEBUG nova.virt.libvirt.guest [None req-d1bdc967-a353-4ae3-8391-db410466edc5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Oct  7 16:21:58 np0005474864 nova_compute[192593]: 2025-10-07 20:21:58.243 2 DEBUG nova.compute.manager [None req-d1bdc967-a353-4ae3-8391-db410466edc5 db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:21:58 np0005474864 nova_compute[192593]: 2025-10-07 20:21:58.259 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:21:58 np0005474864 nova_compute[192593]: 2025-10-07 20:21:58.262 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:21:58 np0005474864 nova_compute[192593]: 2025-10-07 20:21:58.287 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Oct  7 16:21:58 np0005474864 nova_compute[192593]: 2025-10-07 20:21:58.816 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:59 np0005474864 nova_compute[192593]: 2025-10-07 20:21:59.141 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:21:59 np0005474864 nova_compute[192593]: 2025-10-07 20:21:59.739 2 DEBUG nova.compute.manager [req-88c1d940-5531-480e-8940-255df4de8e26 req-180a0c40-e19d-493c-9a3d-b5fa23504b0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Received event network-vif-plugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:21:59 np0005474864 nova_compute[192593]: 2025-10-07 20:21:59.740 2 DEBUG oslo_concurrency.lockutils [req-88c1d940-5531-480e-8940-255df4de8e26 req-180a0c40-e19d-493c-9a3d-b5fa23504b0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:59 np0005474864 nova_compute[192593]: 2025-10-07 20:21:59.740 2 DEBUG oslo_concurrency.lockutils [req-88c1d940-5531-480e-8940-255df4de8e26 req-180a0c40-e19d-493c-9a3d-b5fa23504b0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:59 np0005474864 nova_compute[192593]: 2025-10-07 20:21:59.741 2 DEBUG oslo_concurrency.lockutils [req-88c1d940-5531-480e-8940-255df4de8e26 req-180a0c40-e19d-493c-9a3d-b5fa23504b0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:59 np0005474864 nova_compute[192593]: 2025-10-07 20:21:59.741 2 DEBUG nova.compute.manager [req-88c1d940-5531-480e-8940-255df4de8e26 req-180a0c40-e19d-493c-9a3d-b5fa23504b0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] No waiting events found dispatching network-vif-plugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:21:59 np0005474864 nova_compute[192593]: 2025-10-07 20:21:59.742 2 WARNING nova.compute.manager [req-88c1d940-5531-480e-8940-255df4de8e26 req-180a0c40-e19d-493c-9a3d-b5fa23504b0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Received unexpected event network-vif-plugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf for instance with vm_state active and task_state None.#033[00m
Oct  7 16:21:59 np0005474864 nova_compute[192593]: 2025-10-07 20:21:59.742 2 DEBUG nova.compute.manager [req-88c1d940-5531-480e-8940-255df4de8e26 req-180a0c40-e19d-493c-9a3d-b5fa23504b0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Received event network-vif-plugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:21:59 np0005474864 nova_compute[192593]: 2025-10-07 20:21:59.743 2 DEBUG oslo_concurrency.lockutils [req-88c1d940-5531-480e-8940-255df4de8e26 req-180a0c40-e19d-493c-9a3d-b5fa23504b0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:21:59 np0005474864 nova_compute[192593]: 2025-10-07 20:21:59.744 2 DEBUG oslo_concurrency.lockutils [req-88c1d940-5531-480e-8940-255df4de8e26 req-180a0c40-e19d-493c-9a3d-b5fa23504b0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:21:59 np0005474864 nova_compute[192593]: 2025-10-07 20:21:59.744 2 DEBUG oslo_concurrency.lockutils [req-88c1d940-5531-480e-8940-255df4de8e26 req-180a0c40-e19d-493c-9a3d-b5fa23504b0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:21:59 np0005474864 nova_compute[192593]: 2025-10-07 20:21:59.744 2 DEBUG nova.compute.manager [req-88c1d940-5531-480e-8940-255df4de8e26 req-180a0c40-e19d-493c-9a3d-b5fa23504b0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] No waiting events found dispatching network-vif-plugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:21:59 np0005474864 nova_compute[192593]: 2025-10-07 20:21:59.745 2 WARNING nova.compute.manager [req-88c1d940-5531-480e-8940-255df4de8e26 req-180a0c40-e19d-493c-9a3d-b5fa23504b0d 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Received unexpected event network-vif-plugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf for instance with vm_state active and task_state None.#033[00m
Oct  7 16:22:00 np0005474864 podman[229084]: 2025-10-07 20:22:00.408169519 +0000 UTC m=+0.096530730 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  7 16:22:03 np0005474864 nova_compute[192593]: 2025-10-07 20:22:03.817 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:03 np0005474864 nova_compute[192593]: 2025-10-07 20:22:03.917 2 INFO nova.compute.manager [None req-fbe650dc-aeb4-4058-8ec7-0c0e709af4fa db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Get console output#033[00m
Oct  7 16:22:03 np0005474864 nova_compute[192593]: 2025-10-07 20:22:03.925 55 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Oct  7 16:22:04 np0005474864 nova_compute[192593]: 2025-10-07 20:22:04.173 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.022 2 DEBUG oslo_concurrency.lockutils [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "36338d64-f0f0-468c-be11-8a124d76cb6e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.024 2 DEBUG oslo_concurrency.lockutils [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.024 2 DEBUG oslo_concurrency.lockutils [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.025 2 DEBUG oslo_concurrency.lockutils [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.025 2 DEBUG oslo_concurrency.lockutils [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.027 2 INFO nova.compute.manager [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Terminating instance#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.029 2 DEBUG nova.compute.manager [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  7 16:22:05 np0005474864 kernel: tap82ce1d9a-86 (unregistering): left promiscuous mode
Oct  7 16:22:05 np0005474864 NetworkManager[51631]: <info>  [1759868525.0574] device (tap82ce1d9a-86): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.067 2 DEBUG nova.compute.manager [req-53b0447d-98e1-4ace-bb2c-dcd2fc9efddc req-dd12607f-b2cf-427e-9ae2-7fe14dc3db2b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Received event network-changed-82ce1d9a-86a1-4412-a852-b1ee4b3139bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.068 2 DEBUG nova.compute.manager [req-53b0447d-98e1-4ace-bb2c-dcd2fc9efddc req-dd12607f-b2cf-427e-9ae2-7fe14dc3db2b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Refreshing instance network info cache due to event network-changed-82ce1d9a-86a1-4412-a852-b1ee4b3139bf. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.069 2 DEBUG oslo_concurrency.lockutils [req-53b0447d-98e1-4ace-bb2c-dcd2fc9efddc req-dd12607f-b2cf-427e-9ae2-7fe14dc3db2b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-36338d64-f0f0-468c-be11-8a124d76cb6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.070 2 DEBUG oslo_concurrency.lockutils [req-53b0447d-98e1-4ace-bb2c-dcd2fc9efddc req-dd12607f-b2cf-427e-9ae2-7fe14dc3db2b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-36338d64-f0f0-468c-be11-8a124d76cb6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.070 2 DEBUG nova.network.neutron [req-53b0447d-98e1-4ace-bb2c-dcd2fc9efddc req-dd12607f-b2cf-427e-9ae2-7fe14dc3db2b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Refreshing network info cache for port 82ce1d9a-86a1-4412-a852-b1ee4b3139bf _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:22:05 np0005474864 ovn_controller[94801]: 2025-10-07T20:22:05Z|00258|binding|INFO|Releasing lport 82ce1d9a-86a1-4412-a852-b1ee4b3139bf from this chassis (sb_readonly=0)
Oct  7 16:22:05 np0005474864 ovn_controller[94801]: 2025-10-07T20:22:05Z|00259|binding|INFO|Setting lport 82ce1d9a-86a1-4412-a852-b1ee4b3139bf down in Southbound
Oct  7 16:22:05 np0005474864 ovn_controller[94801]: 2025-10-07T20:22:05Z|00260|binding|INFO|Removing iface tap82ce1d9a-86 ovn-installed in OVS
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.072 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:05 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:05.082 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:40:18:be 10.100.0.10'], port_security=['fa:16:3e:40:18:be 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '36338d64-f0f0-468c-be11-8a124d76cb6e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7a72b09b-88c3-4d24-8cee-ec65d8210d47', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '8a545a398e2e433bbe3f3dfa2ec4ebcb', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'bc1c8622-dacf-40f5-b88e-b142e1321b0f', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=51fe249a-d1c8-46e8-8b8e-10a8ba338db8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=82ce1d9a-86a1-4412-a852-b1ee4b3139bf) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:22:05 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:05.090 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 82ce1d9a-86a1-4412-a852-b1ee4b3139bf in datapath 7a72b09b-88c3-4d24-8cee-ec65d8210d47 unbound from our chassis#033[00m
Oct  7 16:22:05 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:05.092 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7a72b09b-88c3-4d24-8cee-ec65d8210d47, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:22:05 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:05.093 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[816f40a5-0f02-42bc-96b5-ee24d267a7aa]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:22:05 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:05.094 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47 namespace which is not needed anymore#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:05 np0005474864 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000002d.scope: Deactivated successfully.
Oct  7 16:22:05 np0005474864 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d0000002d.scope: Consumed 1.651s CPU time.
Oct  7 16:22:05 np0005474864 systemd-machined[152586]: Machine qemu-17-instance-0000002d terminated.
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.291 2 INFO nova.virt.libvirt.driver [-] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Instance destroyed successfully.#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.291 2 DEBUG nova.objects.instance [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lazy-loading 'resources' on Instance uuid 36338d64-f0f0-468c-be11-8a124d76cb6e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.306 2 DEBUG nova.virt.libvirt.vif [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:21:23Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-883660464',display_name='tempest-TestNetworkAdvancedServerOps-server-883660464',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-883660464',id=45,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBOyB3CcNORNcsfVL/w9sXfU3FdzkZ4AiGT/StOp9LHIsdKxlw+iLDlhVW1MIaui1RMyyiBSqaPsgoKcY2aLSu0V33V1Wp3zJIlOmQfh9LFe+fpXTG0rgU+nykyRls0Ao5w==',key_name='tempest-TestNetworkAdvancedServerOps-516914566',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:21:31Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='8a545a398e2e433bbe3f3dfa2ec4ebcb',ramdisk_id='',reservation_id='r-s2m74s4z',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-585003851',owner_user_name='tempest-TestNetworkAdvancedServerOps-585003851-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:21:58Z,user_data=None,user_id='db22b0e0f6594362af24484ba9b01936',uuid=36338d64-f0f0-468c-be11-8a124d76cb6e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "address": "fa:16:3e:40:18:be", "network": {"id": "7a72b09b-88c3-4d24-8cee-ec65d8210d47", "bridge": "br-int", "label": "tempest-network-smoke--766702828", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ce1d9a-86", "ovs_interfaceid": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.307 2 DEBUG nova.network.os_vif_util [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converting VIF {"id": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "address": "fa:16:3e:40:18:be", "network": {"id": "7a72b09b-88c3-4d24-8cee-ec65d8210d47", "bridge": "br-int", "label": "tempest-network-smoke--766702828", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.175", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ce1d9a-86", "ovs_interfaceid": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.307 2 DEBUG nova.network.os_vif_util [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:40:18:be,bridge_name='br-int',has_traffic_filtering=True,id=82ce1d9a-86a1-4412-a852-b1ee4b3139bf,network=Network(7a72b09b-88c3-4d24-8cee-ec65d8210d47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ce1d9a-86') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.308 2 DEBUG os_vif [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:18:be,bridge_name='br-int',has_traffic_filtering=True,id=82ce1d9a-86a1-4412-a852-b1ee4b3139bf,network=Network(7a72b09b-88c3-4d24-8cee-ec65d8210d47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ce1d9a-86') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.311 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap82ce1d9a-86, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.319 2 INFO os_vif [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:40:18:be,bridge_name='br-int',has_traffic_filtering=True,id=82ce1d9a-86a1-4412-a852-b1ee4b3139bf,network=Network(7a72b09b-88c3-4d24-8cee-ec65d8210d47),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap82ce1d9a-86')#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.320 2 INFO nova.virt.libvirt.driver [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Deleting instance files /var/lib/nova/instances/36338d64-f0f0-468c-be11-8a124d76cb6e_del#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.321 2 INFO nova.virt.libvirt.driver [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Deletion of /var/lib/nova/instances/36338d64-f0f0-468c-be11-8a124d76cb6e_del complete#033[00m
Oct  7 16:22:05 np0005474864 neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47[229069]: [NOTICE]   (229073) : haproxy version is 2.8.14-c23fe91
Oct  7 16:22:05 np0005474864 neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47[229069]: [NOTICE]   (229073) : path to executable is /usr/sbin/haproxy
Oct  7 16:22:05 np0005474864 neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47[229069]: [WARNING]  (229073) : Exiting Master process...
Oct  7 16:22:05 np0005474864 neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47[229069]: [ALERT]    (229073) : Current worker (229075) exited with code 143 (Terminated)
Oct  7 16:22:05 np0005474864 neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47[229069]: [WARNING]  (229073) : All workers exited. Exiting... (0)
Oct  7 16:22:05 np0005474864 systemd[1]: libpod-4ae445c16654d7830b35977b46590f1b6305f8fb53a96541b813cbe7a96973f9.scope: Deactivated successfully.
Oct  7 16:22:05 np0005474864 podman[229130]: 2025-10-07 20:22:05.367769274 +0000 UTC m=+0.157296810 container died 4ae445c16654d7830b35977b46590f1b6305f8fb53a96541b813cbe7a96973f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.367 2 INFO nova.compute.manager [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.368 2 DEBUG oslo.service.loopingcall [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.368 2 DEBUG nova.compute.manager [-] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  7 16:22:05 np0005474864 nova_compute[192593]: 2025-10-07 20:22:05.368 2 DEBUG nova.network.neutron [-] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  7 16:22:05 np0005474864 systemd[1]: var-lib-containers-storage-overlay-67edd00aed64311ff294bb445707e332abc8ae728833e19fcecce1988ded5cda-merged.mount: Deactivated successfully.
Oct  7 16:22:05 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4ae445c16654d7830b35977b46590f1b6305f8fb53a96541b813cbe7a96973f9-userdata-shm.mount: Deactivated successfully.
Oct  7 16:22:06 np0005474864 podman[229130]: 2025-10-07 20:22:06.051574164 +0000 UTC m=+0.841101700 container cleanup 4ae445c16654d7830b35977b46590f1b6305f8fb53a96541b813cbe7a96973f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  7 16:22:06 np0005474864 systemd[1]: libpod-conmon-4ae445c16654d7830b35977b46590f1b6305f8fb53a96541b813cbe7a96973f9.scope: Deactivated successfully.
Oct  7 16:22:06 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:06.079 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:76:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'ba:bb:91:e8:2b:5d'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:22:06 np0005474864 nova_compute[192593]: 2025-10-07 20:22:06.079 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:06 np0005474864 nova_compute[192593]: 2025-10-07 20:22:06.109 2 DEBUG nova.network.neutron [-] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:22:06 np0005474864 nova_compute[192593]: 2025-10-07 20:22:06.131 2 INFO nova.compute.manager [-] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Took 0.76 seconds to deallocate network for instance.#033[00m
Oct  7 16:22:06 np0005474864 nova_compute[192593]: 2025-10-07 20:22:06.190 2 DEBUG oslo_concurrency.lockutils [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:22:06 np0005474864 nova_compute[192593]: 2025-10-07 20:22:06.190 2 DEBUG oslo_concurrency.lockutils [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:22:06 np0005474864 nova_compute[192593]: 2025-10-07 20:22:06.198 2 DEBUG nova.compute.manager [req-284344a9-05b8-495d-b00f-71013885cd42 req-47ad4c2b-a61d-43cf-b9c8-d63f667dbc59 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Received event network-vif-deleted-82ce1d9a-86a1-4412-a852-b1ee4b3139bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:22:06 np0005474864 nova_compute[192593]: 2025-10-07 20:22:06.257 2 DEBUG nova.compute.provider_tree [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:22:06 np0005474864 nova_compute[192593]: 2025-10-07 20:22:06.273 2 DEBUG nova.scheduler.client.report [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:22:06 np0005474864 nova_compute[192593]: 2025-10-07 20:22:06.294 2 DEBUG oslo_concurrency.lockutils [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.104s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:22:06 np0005474864 nova_compute[192593]: 2025-10-07 20:22:06.316 2 INFO nova.scheduler.client.report [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Deleted allocations for instance 36338d64-f0f0-468c-be11-8a124d76cb6e#033[00m
Oct  7 16:22:06 np0005474864 podman[229179]: 2025-10-07 20:22:06.33796236 +0000 UTC m=+0.257268979 container remove 4ae445c16654d7830b35977b46590f1b6305f8fb53a96541b813cbe7a96973f9 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251001)
Oct  7 16:22:06 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:06.345 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[04d3f0ef-9460-411b-9a57-dee1db2da6e5]: (4, ('Tue Oct  7 08:22:05 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47 (4ae445c16654d7830b35977b46590f1b6305f8fb53a96541b813cbe7a96973f9)\n4ae445c16654d7830b35977b46590f1b6305f8fb53a96541b813cbe7a96973f9\nTue Oct  7 08:22:06 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47 (4ae445c16654d7830b35977b46590f1b6305f8fb53a96541b813cbe7a96973f9)\n4ae445c16654d7830b35977b46590f1b6305f8fb53a96541b813cbe7a96973f9\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:22:06 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:06.347 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[69856377-8726-43fe-b349-69332cf8d56d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:22:06 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:06.348 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7a72b09b-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:22:06 np0005474864 kernel: tap7a72b09b-80: left promiscuous mode
Oct  7 16:22:06 np0005474864 nova_compute[192593]: 2025-10-07 20:22:06.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:06 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:06.360 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[a6313093-004b-4608-88a6-3f79a2478e4d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:22:06 np0005474864 nova_compute[192593]: 2025-10-07 20:22:06.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:06 np0005474864 nova_compute[192593]: 2025-10-07 20:22:06.377 2 DEBUG oslo_concurrency.lockutils [None req-c3de3d32-9b7a-448e-bc4c-cff6d95d3c8f db22b0e0f6594362af24484ba9b01936 8a545a398e2e433bbe3f3dfa2ec4ebcb - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.353s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:22:06 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:06.392 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[127694c6-8fda-4295-a366-7150217ce2a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:22:06 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:06.394 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[8e2b5646-5359-4253-af22-a256cf1ce224]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:22:06 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:06.412 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[77f78ce2-3037-47bc-8f4c-da31d467bc79]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 415274, 'reachable_time': 44244, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229195, 'error': None, 'target': 'ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:22:06 np0005474864 systemd[1]: run-netns-ovnmeta\x2d7a72b09b\x2d88c3\x2d4d24\x2d8cee\x2dec65d8210d47.mount: Deactivated successfully.
Oct  7 16:22:06 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:06.414 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7a72b09b-88c3-4d24-8cee-ec65d8210d47 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:22:06 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:06.415 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[573abd0d-0447-4933-bbdd-98f86d5dbb06]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:22:06 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:06.416 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 5 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  7 16:22:06 np0005474864 nova_compute[192593]: 2025-10-07 20:22:06.558 2 DEBUG nova.network.neutron [req-53b0447d-98e1-4ace-bb2c-dcd2fc9efddc req-dd12607f-b2cf-427e-9ae2-7fe14dc3db2b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Updated VIF entry in instance network info cache for port 82ce1d9a-86a1-4412-a852-b1ee4b3139bf. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:22:06 np0005474864 nova_compute[192593]: 2025-10-07 20:22:06.559 2 DEBUG nova.network.neutron [req-53b0447d-98e1-4ace-bb2c-dcd2fc9efddc req-dd12607f-b2cf-427e-9ae2-7fe14dc3db2b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Updating instance_info_cache with network_info: [{"id": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "address": "fa:16:3e:40:18:be", "network": {"id": "7a72b09b-88c3-4d24-8cee-ec65d8210d47", "bridge": "br-int", "label": "tempest-network-smoke--766702828", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "8a545a398e2e433bbe3f3dfa2ec4ebcb", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap82ce1d9a-86", "ovs_interfaceid": "82ce1d9a-86a1-4412-a852-b1ee4b3139bf", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:22:06 np0005474864 nova_compute[192593]: 2025-10-07 20:22:06.581 2 DEBUG oslo_concurrency.lockutils [req-53b0447d-98e1-4ace-bb2c-dcd2fc9efddc req-dd12607f-b2cf-427e-9ae2-7fe14dc3db2b 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-36338d64-f0f0-468c-be11-8a124d76cb6e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:22:07 np0005474864 nova_compute[192593]: 2025-10-07 20:22:07.144 2 DEBUG nova.compute.manager [req-0b168c45-ab48-4fff-be7b-01ec6ef97b43 req-bf76990e-97c8-4630-a976-4fcc35ad6ac9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Received event network-vif-unplugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:22:07 np0005474864 nova_compute[192593]: 2025-10-07 20:22:07.145 2 DEBUG oslo_concurrency.lockutils [req-0b168c45-ab48-4fff-be7b-01ec6ef97b43 req-bf76990e-97c8-4630-a976-4fcc35ad6ac9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:22:07 np0005474864 nova_compute[192593]: 2025-10-07 20:22:07.145 2 DEBUG oslo_concurrency.lockutils [req-0b168c45-ab48-4fff-be7b-01ec6ef97b43 req-bf76990e-97c8-4630-a976-4fcc35ad6ac9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:22:07 np0005474864 nova_compute[192593]: 2025-10-07 20:22:07.145 2 DEBUG oslo_concurrency.lockutils [req-0b168c45-ab48-4fff-be7b-01ec6ef97b43 req-bf76990e-97c8-4630-a976-4fcc35ad6ac9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:22:07 np0005474864 nova_compute[192593]: 2025-10-07 20:22:07.146 2 DEBUG nova.compute.manager [req-0b168c45-ab48-4fff-be7b-01ec6ef97b43 req-bf76990e-97c8-4630-a976-4fcc35ad6ac9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] No waiting events found dispatching network-vif-unplugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:22:07 np0005474864 nova_compute[192593]: 2025-10-07 20:22:07.146 2 WARNING nova.compute.manager [req-0b168c45-ab48-4fff-be7b-01ec6ef97b43 req-bf76990e-97c8-4630-a976-4fcc35ad6ac9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Received unexpected event network-vif-unplugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf for instance with vm_state deleted and task_state None.#033[00m
Oct  7 16:22:07 np0005474864 nova_compute[192593]: 2025-10-07 20:22:07.146 2 DEBUG nova.compute.manager [req-0b168c45-ab48-4fff-be7b-01ec6ef97b43 req-bf76990e-97c8-4630-a976-4fcc35ad6ac9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Received event network-vif-plugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:22:07 np0005474864 nova_compute[192593]: 2025-10-07 20:22:07.147 2 DEBUG oslo_concurrency.lockutils [req-0b168c45-ab48-4fff-be7b-01ec6ef97b43 req-bf76990e-97c8-4630-a976-4fcc35ad6ac9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:22:07 np0005474864 nova_compute[192593]: 2025-10-07 20:22:07.147 2 DEBUG oslo_concurrency.lockutils [req-0b168c45-ab48-4fff-be7b-01ec6ef97b43 req-bf76990e-97c8-4630-a976-4fcc35ad6ac9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:22:07 np0005474864 nova_compute[192593]: 2025-10-07 20:22:07.147 2 DEBUG oslo_concurrency.lockutils [req-0b168c45-ab48-4fff-be7b-01ec6ef97b43 req-bf76990e-97c8-4630-a976-4fcc35ad6ac9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "36338d64-f0f0-468c-be11-8a124d76cb6e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:22:07 np0005474864 nova_compute[192593]: 2025-10-07 20:22:07.148 2 DEBUG nova.compute.manager [req-0b168c45-ab48-4fff-be7b-01ec6ef97b43 req-bf76990e-97c8-4630-a976-4fcc35ad6ac9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] No waiting events found dispatching network-vif-plugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:22:07 np0005474864 nova_compute[192593]: 2025-10-07 20:22:07.148 2 WARNING nova.compute.manager [req-0b168c45-ab48-4fff-be7b-01ec6ef97b43 req-bf76990e-97c8-4630-a976-4fcc35ad6ac9 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Received unexpected event network-vif-plugged-82ce1d9a-86a1-4412-a852-b1ee4b3139bf for instance with vm_state deleted and task_state None.#033[00m
Oct  7 16:22:09 np0005474864 nova_compute[192593]: 2025-10-07 20:22:09.175 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:10 np0005474864 nova_compute[192593]: 2025-10-07 20:22:10.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:10 np0005474864 nova_compute[192593]: 2025-10-07 20:22:10.244 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:10 np0005474864 nova_compute[192593]: 2025-10-07 20:22:10.315 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:11 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:11.418 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:22:13 np0005474864 podman[229197]: 2025-10-07 20:22:13.402545307 +0000 UTC m=+0.088602262 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  7 16:22:13 np0005474864 podman[229198]: 2025-10-07 20:22:13.414787849 +0000 UTC m=+0.101793472 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, com.redhat.component=ubi9-minimal-container)
Oct  7 16:22:14 np0005474864 nova_compute[192593]: 2025-10-07 20:22:14.177 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:15 np0005474864 nova_compute[192593]: 2025-10-07 20:22:15.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:16.198 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:22:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:16.200 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:22:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:22:16.201 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:22:17 np0005474864 podman[229245]: 2025-10-07 20:22:17.404710215 +0000 UTC m=+0.085590816 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  7 16:22:17 np0005474864 podman[229243]: 2025-10-07 20:22:17.407128934 +0000 UTC m=+0.100622408 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  7 16:22:17 np0005474864 podman[229244]: 2025-10-07 20:22:17.419868961 +0000 UTC m=+0.114124527 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:22:19 np0005474864 nova_compute[192593]: 2025-10-07 20:22:19.180 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:20 np0005474864 nova_compute[192593]: 2025-10-07 20:22:20.289 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759868525.2885337, 36338d64-f0f0-468c-be11-8a124d76cb6e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:22:20 np0005474864 nova_compute[192593]: 2025-10-07 20:22:20.289 2 INFO nova.compute.manager [-] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] VM Stopped (Lifecycle Event)#033[00m
Oct  7 16:22:20 np0005474864 nova_compute[192593]: 2025-10-07 20:22:20.321 2 DEBUG nova.compute.manager [None req-7565a779-5269-4330-bd84-8e3532980c43 - - - - - -] [instance: 36338d64-f0f0-468c-be11-8a124d76cb6e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:22:20 np0005474864 nova_compute[192593]: 2025-10-07 20:22:20.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:23 np0005474864 podman[229303]: 2025-10-07 20:22:23.365385205 +0000 UTC m=+0.064462247 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  7 16:22:24 np0005474864 nova_compute[192593]: 2025-10-07 20:22:24.213 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:25 np0005474864 nova_compute[192593]: 2025-10-07 20:22:25.364 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:27 np0005474864 nova_compute[192593]: 2025-10-07 20:22:27.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:22:27 np0005474864 nova_compute[192593]: 2025-10-07 20:22:27.122 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:22:27 np0005474864 nova_compute[192593]: 2025-10-07 20:22:27.123 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:22:27 np0005474864 nova_compute[192593]: 2025-10-07 20:22:27.123 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:22:27 np0005474864 nova_compute[192593]: 2025-10-07 20:22:27.123 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:22:27 np0005474864 podman[229325]: 2025-10-07 20:22:27.243562113 +0000 UTC m=+0.087071298 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  7 16:22:27 np0005474864 nova_compute[192593]: 2025-10-07 20:22:27.293 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:22:27 np0005474864 nova_compute[192593]: 2025-10-07 20:22:27.295 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5703MB free_disk=73.46337890625GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:22:27 np0005474864 nova_compute[192593]: 2025-10-07 20:22:27.295 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:22:27 np0005474864 nova_compute[192593]: 2025-10-07 20:22:27.295 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:22:27 np0005474864 nova_compute[192593]: 2025-10-07 20:22:27.368 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:22:27 np0005474864 nova_compute[192593]: 2025-10-07 20:22:27.369 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:22:27 np0005474864 nova_compute[192593]: 2025-10-07 20:22:27.385 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:22:27 np0005474864 nova_compute[192593]: 2025-10-07 20:22:27.398 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:22:27 np0005474864 nova_compute[192593]: 2025-10-07 20:22:27.420 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:22:27 np0005474864 nova_compute[192593]: 2025-10-07 20:22:27.421 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:22:28 np0005474864 nova_compute[192593]: 2025-10-07 20:22:28.421 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:22:29 np0005474864 nova_compute[192593]: 2025-10-07 20:22:29.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:22:29 np0005474864 nova_compute[192593]: 2025-10-07 20:22:29.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:30 np0005474864 nova_compute[192593]: 2025-10-07 20:22:30.088 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:22:30 np0005474864 nova_compute[192593]: 2025-10-07 20:22:30.091 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:22:30 np0005474864 nova_compute[192593]: 2025-10-07 20:22:30.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.256 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.257 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.258 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.258 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.258 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.258 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.258 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.258 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.258 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.258 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:22:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:22:31 np0005474864 podman[229349]: 2025-10-07 20:22:31.39746242 +0000 UTC m=+0.090783165 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:22:33 np0005474864 nova_compute[192593]: 2025-10-07 20:22:33.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:22:34 np0005474864 nova_compute[192593]: 2025-10-07 20:22:34.095 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:22:34 np0005474864 nova_compute[192593]: 2025-10-07 20:22:34.095 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:22:34 np0005474864 nova_compute[192593]: 2025-10-07 20:22:34.096 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:22:34 np0005474864 nova_compute[192593]: 2025-10-07 20:22:34.116 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:22:34 np0005474864 nova_compute[192593]: 2025-10-07 20:22:34.218 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:35 np0005474864 nova_compute[192593]: 2025-10-07 20:22:35.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:22:35 np0005474864 nova_compute[192593]: 2025-10-07 20:22:35.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:22:35 np0005474864 nova_compute[192593]: 2025-10-07 20:22:35.454 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:37 np0005474864 nova_compute[192593]: 2025-10-07 20:22:37.094 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:22:39 np0005474864 nova_compute[192593]: 2025-10-07 20:22:39.220 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:40 np0005474864 nova_compute[192593]: 2025-10-07 20:22:40.458 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:44 np0005474864 nova_compute[192593]: 2025-10-07 20:22:44.223 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:44 np0005474864 podman[229370]: 2025-10-07 20:22:44.366742475 +0000 UTC m=+0.061739809 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  7 16:22:44 np0005474864 podman[229371]: 2025-10-07 20:22:44.41344083 +0000 UTC m=+0.093581496 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., name=ubi9-minimal)
Oct  7 16:22:45 np0005474864 nova_compute[192593]: 2025-10-07 20:22:45.462 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:48 np0005474864 podman[229416]: 2025-10-07 20:22:48.373433794 +0000 UTC m=+0.059209846 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001)
Oct  7 16:22:48 np0005474864 podman[229418]: 2025-10-07 20:22:48.405327093 +0000 UTC m=+0.077161003 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd)
Oct  7 16:22:48 np0005474864 podman[229417]: 2025-10-07 20:22:48.437479418 +0000 UTC m=+0.116615889 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:22:49 np0005474864 nova_compute[192593]: 2025-10-07 20:22:49.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:50 np0005474864 nova_compute[192593]: 2025-10-07 20:22:50.465 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:54 np0005474864 nova_compute[192593]: 2025-10-07 20:22:54.230 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:54 np0005474864 podman[229478]: 2025-10-07 20:22:54.355501801 +0000 UTC m=+0.053488741 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  7 16:22:55 np0005474864 nova_compute[192593]: 2025-10-07 20:22:55.469 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:22:57 np0005474864 podman[229497]: 2025-10-07 20:22:57.419382282 +0000 UTC m=+0.106354483 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:22:59 np0005474864 nova_compute[192593]: 2025-10-07 20:22:59.233 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:00 np0005474864 nova_compute[192593]: 2025-10-07 20:23:00.472 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:02 np0005474864 podman[229522]: 2025-10-07 20:23:02.395060649 +0000 UTC m=+0.079975824 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  7 16:23:03 np0005474864 ovn_controller[94801]: 2025-10-07T20:23:03Z|00261|memory_trim|INFO|Detected inactivity (last active 30002 ms ago): trimming memory
Oct  7 16:23:04 np0005474864 nova_compute[192593]: 2025-10-07 20:23:04.260 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:05 np0005474864 nova_compute[192593]: 2025-10-07 20:23:05.509 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:06 np0005474864 nova_compute[192593]: 2025-10-07 20:23:06.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:06 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:06.378 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:76:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'ba:bb:91:e8:2b:5d'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:23:06 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:06.381 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  7 16:23:07 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:07.385 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:23:09 np0005474864 nova_compute[192593]: 2025-10-07 20:23:09.261 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:10 np0005474864 nova_compute[192593]: 2025-10-07 20:23:10.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:14 np0005474864 nova_compute[192593]: 2025-10-07 20:23:14.263 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:15 np0005474864 podman[229544]: 2025-10-07 20:23:15.391056765 +0000 UTC m=+0.084593627 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, version=9.6, name=ubi9-minimal, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, architecture=x86_64)
Oct  7 16:23:15 np0005474864 podman[229543]: 2025-10-07 20:23:15.424659172 +0000 UTC m=+0.114354593 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:23:15 np0005474864 nova_compute[192593]: 2025-10-07 20:23:15.518 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:16 np0005474864 nova_compute[192593]: 2025-10-07 20:23:16.015 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:23:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:16.199 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:23:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:16.200 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:23:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:16.200 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:23:19 np0005474864 nova_compute[192593]: 2025-10-07 20:23:19.267 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:19 np0005474864 podman[229590]: 2025-10-07 20:23:19.38621243 +0000 UTC m=+0.072979682 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Oct  7 16:23:19 np0005474864 podman[229588]: 2025-10-07 20:23:19.46261848 +0000 UTC m=+0.150822883 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=iscsid)
Oct  7 16:23:19 np0005474864 podman[229589]: 2025-10-07 20:23:19.480584328 +0000 UTC m=+0.159418452 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller)
Oct  7 16:23:20 np0005474864 nova_compute[192593]: 2025-10-07 20:23:20.580 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:24 np0005474864 nova_compute[192593]: 2025-10-07 20:23:24.269 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:25 np0005474864 podman[229650]: 2025-10-07 20:23:25.400864525 +0000 UTC m=+0.089297832 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:23:25 np0005474864 nova_compute[192593]: 2025-10-07 20:23:25.584 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:28 np0005474864 nova_compute[192593]: 2025-10-07 20:23:28.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:23:28 np0005474864 nova_compute[192593]: 2025-10-07 20:23:28.129 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:23:28 np0005474864 nova_compute[192593]: 2025-10-07 20:23:28.130 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:23:28 np0005474864 nova_compute[192593]: 2025-10-07 20:23:28.130 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:23:28 np0005474864 nova_compute[192593]: 2025-10-07 20:23:28.131 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:23:28 np0005474864 nova_compute[192593]: 2025-10-07 20:23:28.332 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:23:28 np0005474864 nova_compute[192593]: 2025-10-07 20:23:28.333 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5713MB free_disk=73.46347427368164GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:23:28 np0005474864 nova_compute[192593]: 2025-10-07 20:23:28.334 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:23:28 np0005474864 nova_compute[192593]: 2025-10-07 20:23:28.334 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:23:28 np0005474864 podman[229672]: 2025-10-07 20:23:28.368307149 +0000 UTC m=+0.060263577 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:23:28 np0005474864 nova_compute[192593]: 2025-10-07 20:23:28.480 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:23:28 np0005474864 nova_compute[192593]: 2025-10-07 20:23:28.480 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:23:28 np0005474864 nova_compute[192593]: 2025-10-07 20:23:28.557 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:23:28 np0005474864 nova_compute[192593]: 2025-10-07 20:23:28.574 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:23:28 np0005474864 nova_compute[192593]: 2025-10-07 20:23:28.575 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:23:28 np0005474864 nova_compute[192593]: 2025-10-07 20:23:28.575 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.241s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:23:29 np0005474864 nova_compute[192593]: 2025-10-07 20:23:29.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:29 np0005474864 nova_compute[192593]: 2025-10-07 20:23:29.575 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:23:30 np0005474864 nova_compute[192593]: 2025-10-07 20:23:30.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:23:30 np0005474864 nova_compute[192593]: 2025-10-07 20:23:30.587 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:31 np0005474864 nova_compute[192593]: 2025-10-07 20:23:31.088 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:23:32 np0005474864 nova_compute[192593]: 2025-10-07 20:23:32.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:23:33 np0005474864 nova_compute[192593]: 2025-10-07 20:23:33.088 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:23:33 np0005474864 podman[229696]: 2025-10-07 20:23:33.381312623 +0000 UTC m=+0.078651815 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute)
Oct  7 16:23:34 np0005474864 nova_compute[192593]: 2025-10-07 20:23:34.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:23:34 np0005474864 nova_compute[192593]: 2025-10-07 20:23:34.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:23:34 np0005474864 nova_compute[192593]: 2025-10-07 20:23:34.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:23:34 np0005474864 nova_compute[192593]: 2025-10-07 20:23:34.115 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:23:34 np0005474864 nova_compute[192593]: 2025-10-07 20:23:34.116 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:23:34 np0005474864 nova_compute[192593]: 2025-10-07 20:23:34.116 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  7 16:23:34 np0005474864 nova_compute[192593]: 2025-10-07 20:23:34.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:35 np0005474864 nova_compute[192593]: 2025-10-07 20:23:35.116 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:23:35 np0005474864 nova_compute[192593]: 2025-10-07 20:23:35.627 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:37 np0005474864 nova_compute[192593]: 2025-10-07 20:23:37.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:23:37 np0005474864 nova_compute[192593]: 2025-10-07 20:23:37.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:23:37 np0005474864 nova_compute[192593]: 2025-10-07 20:23:37.094 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:23:38 np0005474864 nova_compute[192593]: 2025-10-07 20:23:38.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:23:38 np0005474864 nova_compute[192593]: 2025-10-07 20:23:38.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  7 16:23:38 np0005474864 nova_compute[192593]: 2025-10-07 20:23:38.120 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  7 16:23:39 np0005474864 nova_compute[192593]: 2025-10-07 20:23:39.275 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:40 np0005474864 nova_compute[192593]: 2025-10-07 20:23:40.630 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:40.705 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8d:06:86 10.100.0.2 2001:db8::f816:3eff:fe8d:686'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe8d:686/64', 'neutron:device_id': 'ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c77dfc09-a940-4330-b50f-d7b09c70d5c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d5b554d-fe47-46fd-9f7a-274db91b3c84, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a2d4218c-8d67-4147-bfd2-daf44815e38b) old=Port_Binding(mac=['fa:16:3e:8d:06:86 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c77dfc09-a940-4330-b50f-d7b09c70d5c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:23:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:40.707 103685 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a2d4218c-8d67-4147-bfd2-daf44815e38b in datapath c77dfc09-a940-4330-b50f-d7b09c70d5c0 updated#033[00m
Oct  7 16:23:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:40.708 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c77dfc09-a940-4330-b50f-d7b09c70d5c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:23:40 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:40.709 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[efa2698d-7535-4511-a77c-6b04300bc8bb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:23:44 np0005474864 nova_compute[192593]: 2025-10-07 20:23:44.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:44.735 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8d:06:86 10.100.0.2 2001:db8:0:1:f816:3eff:fe8d:686 2001:db8::f816:3eff:fe8d:686'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe8d:686/64 2001:db8::f816:3eff:fe8d:686/64', 'neutron:device_id': 'ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c77dfc09-a940-4330-b50f-d7b09c70d5c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d5b554d-fe47-46fd-9f7a-274db91b3c84, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=a2d4218c-8d67-4147-bfd2-daf44815e38b) old=Port_Binding(mac=['fa:16:3e:8d:06:86 10.100.0.2 2001:db8::f816:3eff:fe8d:686'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe8d:686/64', 'neutron:device_id': 'ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c77dfc09-a940-4330-b50f-d7b09c70d5c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:23:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:44.737 103685 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port a2d4218c-8d67-4147-bfd2-daf44815e38b in datapath c77dfc09-a940-4330-b50f-d7b09c70d5c0 updated#033[00m
Oct  7 16:23:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:44.739 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c77dfc09-a940-4330-b50f-d7b09c70d5c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:23:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:44.740 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[4df8f44f-0b9e-4ae1-b6a2-66ba5abf2a75]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:23:45 np0005474864 nova_compute[192593]: 2025-10-07 20:23:45.680 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:46 np0005474864 podman[229718]: 2025-10-07 20:23:46.399950977 +0000 UTC m=+0.078174449 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, maintainer=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, release=1755695350, container_name=openstack_network_exporter, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  7 16:23:46 np0005474864 podman[229717]: 2025-10-07 20:23:46.405508537 +0000 UTC m=+0.085835940 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  7 16:23:48 np0005474864 nova_compute[192593]: 2025-10-07 20:23:48.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.116 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Acquiring lock "16e9367c-3ee4-4605-bcb0-bc881516215c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.117 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lock "16e9367c-3ee4-4605-bcb0-bc881516215c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.133 2 DEBUG nova.compute.manager [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.217 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.218 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.224 2 DEBUG nova.virt.hardware [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.224 2 INFO nova.compute.claims [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.482 2 DEBUG nova.compute.provider_tree [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.498 2 DEBUG nova.scheduler.client.report [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.531 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.313s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.532 2 DEBUG nova.compute.manager [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.588 2 DEBUG nova.compute.manager [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.588 2 DEBUG nova.network.neutron [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.615 2 INFO nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.640 2 DEBUG nova.compute.manager [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.727 2 DEBUG nova.compute.manager [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.729 2 DEBUG nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.730 2 INFO nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Creating image(s)#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.731 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Acquiring lock "/var/lib/nova/instances/16e9367c-3ee4-4605-bcb0-bc881516215c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.731 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lock "/var/lib/nova/instances/16e9367c-3ee4-4605-bcb0-bc881516215c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.732 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lock "/var/lib/nova/instances/16e9367c-3ee4-4605-bcb0-bc881516215c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.756 2 DEBUG oslo_concurrency.processutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.819 2 DEBUG oslo_concurrency.processutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.821 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.822 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.833 2 DEBUG oslo_concurrency.processutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.886 2 DEBUG oslo_concurrency.processutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.887 2 DEBUG oslo_concurrency.processutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/16e9367c-3ee4-4605-bcb0-bc881516215c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.921 2 DEBUG oslo_concurrency.processutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/16e9367c-3ee4-4605-bcb0-bc881516215c/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.921 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:23:49 np0005474864 nova_compute[192593]: 2025-10-07 20:23:49.922 2 DEBUG oslo_concurrency.processutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:23:50 np0005474864 nova_compute[192593]: 2025-10-07 20:23:50.010 2 DEBUG oslo_concurrency.processutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:23:50 np0005474864 nova_compute[192593]: 2025-10-07 20:23:50.011 2 DEBUG nova.virt.disk.api [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Checking if we can resize image /var/lib/nova/instances/16e9367c-3ee4-4605-bcb0-bc881516215c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  7 16:23:50 np0005474864 nova_compute[192593]: 2025-10-07 20:23:50.012 2 DEBUG oslo_concurrency.processutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/16e9367c-3ee4-4605-bcb0-bc881516215c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:23:50 np0005474864 nova_compute[192593]: 2025-10-07 20:23:50.096 2 DEBUG oslo_concurrency.processutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/16e9367c-3ee4-4605-bcb0-bc881516215c/disk --force-share --output=json" returned: 0 in 0.085s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:23:50 np0005474864 nova_compute[192593]: 2025-10-07 20:23:50.098 2 DEBUG nova.virt.disk.api [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Cannot resize image /var/lib/nova/instances/16e9367c-3ee4-4605-bcb0-bc881516215c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  7 16:23:50 np0005474864 nova_compute[192593]: 2025-10-07 20:23:50.099 2 DEBUG nova.objects.instance [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lazy-loading 'migration_context' on Instance uuid 16e9367c-3ee4-4605-bcb0-bc881516215c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:23:50 np0005474864 nova_compute[192593]: 2025-10-07 20:23:50.119 2 DEBUG nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  7 16:23:50 np0005474864 nova_compute[192593]: 2025-10-07 20:23:50.119 2 DEBUG nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Ensure instance console log exists: /var/lib/nova/instances/16e9367c-3ee4-4605-bcb0-bc881516215c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  7 16:23:50 np0005474864 nova_compute[192593]: 2025-10-07 20:23:50.120 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:23:50 np0005474864 nova_compute[192593]: 2025-10-07 20:23:50.121 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:23:50 np0005474864 nova_compute[192593]: 2025-10-07 20:23:50.121 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:23:50 np0005474864 podman[229781]: 2025-10-07 20:23:50.3973048 +0000 UTC m=+0.087375744 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  7 16:23:50 np0005474864 podman[229783]: 2025-10-07 20:23:50.434356976 +0000 UTC m=+0.113306440 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Oct  7 16:23:50 np0005474864 podman[229782]: 2025-10-07 20:23:50.434311155 +0000 UTC m=+0.120723914 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  7 16:23:50 np0005474864 nova_compute[192593]: 2025-10-07 20:23:50.682 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:50 np0005474864 nova_compute[192593]: 2025-10-07 20:23:50.801 2 DEBUG nova.network.neutron [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Successfully created port: f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  7 16:23:51 np0005474864 nova_compute[192593]: 2025-10-07 20:23:51.571 2 DEBUG nova.network.neutron [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Successfully updated port: f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:23:51 np0005474864 nova_compute[192593]: 2025-10-07 20:23:51.591 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Acquiring lock "refresh_cache-16e9367c-3ee4-4605-bcb0-bc881516215c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:23:51 np0005474864 nova_compute[192593]: 2025-10-07 20:23:51.591 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Acquired lock "refresh_cache-16e9367c-3ee4-4605-bcb0-bc881516215c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:23:51 np0005474864 nova_compute[192593]: 2025-10-07 20:23:51.592 2 DEBUG nova.network.neutron [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:23:51 np0005474864 nova_compute[192593]: 2025-10-07 20:23:51.718 2 DEBUG nova.compute.manager [req-86a06ba8-d1f2-44f9-9415-a12e3766a2d2 req-9f85b147-e519-4727-a3cd-75ec7bafc84a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Received event network-changed-f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:23:51 np0005474864 nova_compute[192593]: 2025-10-07 20:23:51.719 2 DEBUG nova.compute.manager [req-86a06ba8-d1f2-44f9-9415-a12e3766a2d2 req-9f85b147-e519-4727-a3cd-75ec7bafc84a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Refreshing instance network info cache due to event network-changed-f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:23:51 np0005474864 nova_compute[192593]: 2025-10-07 20:23:51.720 2 DEBUG oslo_concurrency.lockutils [req-86a06ba8-d1f2-44f9-9415-a12e3766a2d2 req-9f85b147-e519-4727-a3cd-75ec7bafc84a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-16e9367c-3ee4-4605-bcb0-bc881516215c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:23:51 np0005474864 nova_compute[192593]: 2025-10-07 20:23:51.801 2 DEBUG nova.network.neutron [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.789 2 DEBUG nova.network.neutron [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Updating instance_info_cache with network_info: [{"id": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "address": "fa:16:3e:79:de:95", "network": {"id": "3ded2375-cc27-4f2e-bd40-5d10893e1420", "bridge": "br-int", "label": "tempest-TestServerMultinode-293113049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "525c3eed395940779f653322e2fd6d0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65ac9cd-66", "ovs_interfaceid": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.807 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Releasing lock "refresh_cache-16e9367c-3ee4-4605-bcb0-bc881516215c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.808 2 DEBUG nova.compute.manager [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Instance network_info: |[{"id": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "address": "fa:16:3e:79:de:95", "network": {"id": "3ded2375-cc27-4f2e-bd40-5d10893e1420", "bridge": "br-int", "label": "tempest-TestServerMultinode-293113049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "525c3eed395940779f653322e2fd6d0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65ac9cd-66", "ovs_interfaceid": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.808 2 DEBUG oslo_concurrency.lockutils [req-86a06ba8-d1f2-44f9-9415-a12e3766a2d2 req-9f85b147-e519-4727-a3cd-75ec7bafc84a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-16e9367c-3ee4-4605-bcb0-bc881516215c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.808 2 DEBUG nova.network.neutron [req-86a06ba8-d1f2-44f9-9415-a12e3766a2d2 req-9f85b147-e519-4727-a3cd-75ec7bafc84a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Refreshing network info cache for port f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.811 2 DEBUG nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Start _get_guest_xml network_info=[{"id": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "address": "fa:16:3e:79:de:95", "network": {"id": "3ded2375-cc27-4f2e-bd40-5d10893e1420", "bridge": "br-int", "label": "tempest-TestServerMultinode-293113049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "525c3eed395940779f653322e2fd6d0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65ac9cd-66", "ovs_interfaceid": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.817 2 WARNING nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.822 2 DEBUG nova.virt.libvirt.host [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.822 2 DEBUG nova.virt.libvirt.host [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.825 2 DEBUG nova.virt.libvirt.host [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.826 2 DEBUG nova.virt.libvirt.host [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.827 2 DEBUG nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.827 2 DEBUG nova.virt.hardware [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T20:09:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3fec056a-1226-48ad-a02c-e4fe097a9363',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.828 2 DEBUG nova.virt.hardware [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.828 2 DEBUG nova.virt.hardware [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.828 2 DEBUG nova.virt.hardware [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.828 2 DEBUG nova.virt.hardware [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.829 2 DEBUG nova.virt.hardware [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.829 2 DEBUG nova.virt.hardware [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.829 2 DEBUG nova.virt.hardware [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.829 2 DEBUG nova.virt.hardware [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.830 2 DEBUG nova.virt.hardware [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.830 2 DEBUG nova.virt.hardware [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.833 2 DEBUG nova.virt.libvirt.vif [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:23:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1379723732',display_name='tempest-TestServerMultinode-server-1379723732',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-1379723732',id=52,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4bb838573ec845c1a1e779d97ada653c',ramdisk_id='',reservation_id='r-9hjmlnmk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-471967560',owner_user_name='tempest-TestServerMultinode-4719675
60-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:23:49Z,user_data=None,user_id='81802ec6d167452692d5e5475be8e6ba',uuid=16e9367c-3ee4-4605-bcb0-bc881516215c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "address": "fa:16:3e:79:de:95", "network": {"id": "3ded2375-cc27-4f2e-bd40-5d10893e1420", "bridge": "br-int", "label": "tempest-TestServerMultinode-293113049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "525c3eed395940779f653322e2fd6d0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65ac9cd-66", "ovs_interfaceid": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.834 2 DEBUG nova.network.os_vif_util [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Converting VIF {"id": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "address": "fa:16:3e:79:de:95", "network": {"id": "3ded2375-cc27-4f2e-bd40-5d10893e1420", "bridge": "br-int", "label": "tempest-TestServerMultinode-293113049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "525c3eed395940779f653322e2fd6d0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65ac9cd-66", "ovs_interfaceid": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.835 2 DEBUG nova.network.os_vif_util [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:79:de:95,bridge_name='br-int',has_traffic_filtering=True,id=f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235,network=Network(3ded2375-cc27-4f2e-bd40-5d10893e1420),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf65ac9cd-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.835 2 DEBUG nova.objects.instance [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lazy-loading 'pci_devices' on Instance uuid 16e9367c-3ee4-4605-bcb0-bc881516215c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.858 2 DEBUG nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] End _get_guest_xml xml=<domain type="kvm">
Oct  7 16:23:53 np0005474864 nova_compute[192593]:  <uuid>16e9367c-3ee4-4605-bcb0-bc881516215c</uuid>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:  <name>instance-00000034</name>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:  <memory>131072</memory>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:  <vcpu>1</vcpu>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <nova:name>tempest-TestServerMultinode-server-1379723732</nova:name>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <nova:creationTime>2025-10-07 20:23:53</nova:creationTime>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <nova:flavor name="m1.nano">
Oct  7 16:23:53 np0005474864 nova_compute[192593]:        <nova:memory>128</nova:memory>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:        <nova:disk>1</nova:disk>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:        <nova:swap>0</nova:swap>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:        <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:        <nova:vcpus>1</nova:vcpus>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      </nova:flavor>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <nova:owner>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:        <nova:user uuid="81802ec6d167452692d5e5475be8e6ba">tempest-TestServerMultinode-471967560-project-admin</nova:user>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:        <nova:project uuid="4bb838573ec845c1a1e779d97ada653c">tempest-TestServerMultinode-471967560</nova:project>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      </nova:owner>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <nova:ports>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:        <nova:port uuid="f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235">
Oct  7 16:23:53 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      </nova:ports>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    </nova:instance>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:  <sysinfo type="smbios">
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <entry name="manufacturer">RDO</entry>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <entry name="product">OpenStack Compute</entry>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <entry name="serial">16e9367c-3ee4-4605-bcb0-bc881516215c</entry>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <entry name="uuid">16e9367c-3ee4-4605-bcb0-bc881516215c</entry>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <entry name="family">Virtual Machine</entry>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <boot dev="hd"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <smbios mode="sysinfo"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <vmcoreinfo/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:  <clock offset="utc">
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <timer name="pit" tickpolicy="delay"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <timer name="hpet" present="no"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:  <cpu mode="custom" match="exact">
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <model>Nehalem</model>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <topology sockets="1" cores="1" threads="1"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <disk type="file" device="disk">
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/16e9367c-3ee4-4605-bcb0-bc881516215c/disk"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <target dev="vda" bus="virtio"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <disk type="file" device="cdrom">
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <driver name="qemu" type="raw" cache="none"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/16e9367c-3ee4-4605-bcb0-bc881516215c/disk.config"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <target dev="sda" bus="sata"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:79:de:95"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <target dev="tapf65ac9cd-66"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <serial type="pty">
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <log file="/var/lib/nova/instances/16e9367c-3ee4-4605-bcb0-bc881516215c/console.log" append="off"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <input type="tablet" bus="usb"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <rng model="virtio">
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <backend model="random">/dev/urandom</backend>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <controller type="usb" index="0"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    <memballoon model="virtio">
Oct  7 16:23:53 np0005474864 nova_compute[192593]:      <stats period="10"/>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:23:53 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:23:53 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:23:53 np0005474864 nova_compute[192593]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.860 2 DEBUG nova.compute.manager [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Preparing to wait for external event network-vif-plugged-f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.861 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Acquiring lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.861 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.861 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.862 2 DEBUG nova.virt.libvirt.vif [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:23:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1379723732',display_name='tempest-TestServerMultinode-server-1379723732',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-1379723732',id=52,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='4bb838573ec845c1a1e779d97ada653c',ramdisk_id='',reservation_id='r-9hjmlnmk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-471967560',owner_user_name='tempest-TestServerMultino
de-471967560-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:23:49Z,user_data=None,user_id='81802ec6d167452692d5e5475be8e6ba',uuid=16e9367c-3ee4-4605-bcb0-bc881516215c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "address": "fa:16:3e:79:de:95", "network": {"id": "3ded2375-cc27-4f2e-bd40-5d10893e1420", "bridge": "br-int", "label": "tempest-TestServerMultinode-293113049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "525c3eed395940779f653322e2fd6d0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65ac9cd-66", "ovs_interfaceid": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.862 2 DEBUG nova.network.os_vif_util [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Converting VIF {"id": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "address": "fa:16:3e:79:de:95", "network": {"id": "3ded2375-cc27-4f2e-bd40-5d10893e1420", "bridge": "br-int", "label": "tempest-TestServerMultinode-293113049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "525c3eed395940779f653322e2fd6d0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65ac9cd-66", "ovs_interfaceid": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.863 2 DEBUG nova.network.os_vif_util [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:79:de:95,bridge_name='br-int',has_traffic_filtering=True,id=f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235,network=Network(3ded2375-cc27-4f2e-bd40-5d10893e1420),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf65ac9cd-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.863 2 DEBUG os_vif [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:de:95,bridge_name='br-int',has_traffic_filtering=True,id=f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235,network=Network(3ded2375-cc27-4f2e-bd40-5d10893e1420),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf65ac9cd-66') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.864 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.864 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.865 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.868 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.868 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf65ac9cd-66, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.869 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf65ac9cd-66, col_values=(('external_ids', {'iface-id': 'f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:79:de:95', 'vm-uuid': '16e9367c-3ee4-4605-bcb0-bc881516215c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.870 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:53 np0005474864 NetworkManager[51631]: <info>  [1759868633.8718] manager: (tapf65ac9cd-66): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/138)
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.873 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.880 2 INFO os_vif [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:de:95,bridge_name='br-int',has_traffic_filtering=True,id=f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235,network=Network(3ded2375-cc27-4f2e-bd40-5d10893e1420),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf65ac9cd-66')#033[00m
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.924 2 DEBUG nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.925 2 DEBUG nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.925 2 DEBUG nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] No VIF found with MAC fa:16:3e:79:de:95, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:23:53 np0005474864 nova_compute[192593]: 2025-10-07 20:23:53.925 2 INFO nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Using config drive#033[00m
Oct  7 16:23:54 np0005474864 nova_compute[192593]: 2025-10-07 20:23:54.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:54 np0005474864 nova_compute[192593]: 2025-10-07 20:23:54.515 2 INFO nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Creating config drive at /var/lib/nova/instances/16e9367c-3ee4-4605-bcb0-bc881516215c/disk.config#033[00m
Oct  7 16:23:54 np0005474864 nova_compute[192593]: 2025-10-07 20:23:54.525 2 DEBUG oslo_concurrency.processutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/16e9367c-3ee4-4605-bcb0-bc881516215c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuvwbj9qj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:23:54 np0005474864 nova_compute[192593]: 2025-10-07 20:23:54.657 2 DEBUG oslo_concurrency.processutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/16e9367c-3ee4-4605-bcb0-bc881516215c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpuvwbj9qj" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:23:54 np0005474864 kernel: tapf65ac9cd-66: entered promiscuous mode
Oct  7 16:23:54 np0005474864 NetworkManager[51631]: <info>  [1759868634.7397] manager: (tapf65ac9cd-66): new Tun device (/org/freedesktop/NetworkManager/Devices/139)
Oct  7 16:23:54 np0005474864 ovn_controller[94801]: 2025-10-07T20:23:54Z|00262|binding|INFO|Claiming lport f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 for this chassis.
Oct  7 16:23:54 np0005474864 ovn_controller[94801]: 2025-10-07T20:23:54Z|00263|binding|INFO|f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235: Claiming fa:16:3e:79:de:95 10.100.0.7
Oct  7 16:23:54 np0005474864 nova_compute[192593]: 2025-10-07 20:23:54.741 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:54 np0005474864 nova_compute[192593]: 2025-10-07 20:23:54.747 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:54 np0005474864 nova_compute[192593]: 2025-10-07 20:23:54.751 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:54.777 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:de:95 10.100.0.7'], port_security=['fa:16:3e:79:de:95 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '16e9367c-3ee4-4605-bcb0-bc881516215c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ded2375-cc27-4f2e-bd40-5d10893e1420', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4bb838573ec845c1a1e779d97ada653c', 'neutron:revision_number': '2', 'neutron:security_group_ids': '8fe8549c-5285-4ed4-82d3-024d96acca3d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a7e1768-94ad-4cd7-a4c9-b120fee2b0f1, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:23:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:54.778 103685 INFO neutron.agent.ovn.metadata.agent [-] Port f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 in datapath 3ded2375-cc27-4f2e-bd40-5d10893e1420 bound to our chassis#033[00m
Oct  7 16:23:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:54.779 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 3ded2375-cc27-4f2e-bd40-5d10893e1420#033[00m
Oct  7 16:23:54 np0005474864 systemd-machined[152586]: New machine qemu-18-instance-00000034.
Oct  7 16:23:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:54.797 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[86c8ba47-eb75-438b-b5f6-61f578f2b732]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:23:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:54.798 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap3ded2375-c1 in ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:23:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:54.802 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap3ded2375-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:23:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:54.802 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[dbc89494-7a9c-4cf5-9c19-9cff0421d552]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:23:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:54.803 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[bbc68665-b553-4e8d-9ef4-7bc5b5f99799]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:23:54 np0005474864 nova_compute[192593]: 2025-10-07 20:23:54.819 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:54.820 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[2ee4baa2-f486-4d41-b7ca-dfb82c060958]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:23:54 np0005474864 systemd[1]: Started Virtual Machine qemu-18-instance-00000034.
Oct  7 16:23:54 np0005474864 ovn_controller[94801]: 2025-10-07T20:23:54Z|00264|binding|INFO|Setting lport f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 ovn-installed in OVS
Oct  7 16:23:54 np0005474864 ovn_controller[94801]: 2025-10-07T20:23:54Z|00265|binding|INFO|Setting lport f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 up in Southbound
Oct  7 16:23:54 np0005474864 nova_compute[192593]: 2025-10-07 20:23:54.825 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:54.835 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[4579c53d-578f-420f-8655-275faaa4f137]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:23:54 np0005474864 systemd-udevd[229869]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:23:54 np0005474864 NetworkManager[51631]: <info>  [1759868634.8548] device (tapf65ac9cd-66): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:23:54 np0005474864 NetworkManager[51631]: <info>  [1759868634.8557] device (tapf65ac9cd-66): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:23:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:54.872 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[1ba17aa6-8fd0-4ae4-9277-d0e984d2b766]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:23:54 np0005474864 systemd-udevd[229873]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:23:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:54.877 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[024724a1-baf8-4c93-8b14-8c3b7c3524c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:23:54 np0005474864 NetworkManager[51631]: <info>  [1759868634.8780] manager: (tap3ded2375-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/140)
Oct  7 16:23:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:54.906 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[d9856f85-2e49-4912-ae8b-8409b45ba0bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:23:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:54.910 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[32b1c625-71df-4bcd-b03c-4bedf3cf8b8c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:23:54 np0005474864 NetworkManager[51631]: <info>  [1759868634.9366] device (tap3ded2375-c0): carrier: link connected
Oct  7 16:23:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:54.944 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[bc8936de-588a-4dd1-85f7-7e3bab30492b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:23:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:54.969 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[085f05ea-c2a8-4057-9ff8-7f060e60da71]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3ded2375-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:34:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 427082, 'reachable_time': 24695, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 229899, 'error': None, 'target': 'ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:23:54 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:54.983 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[c9c11673-9875-4458-ae4e-155e9831cddd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe1f:34db'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 427082, 'tstamp': 427082}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 229900, 'error': None, 'target': 'ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:55.011 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[67285703-a00b-435d-8cc5-6c96833f12c4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap3ded2375-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:1f:34:db'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 80], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 427082, 'reachable_time': 24695, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 229901, 'error': None, 'target': 'ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:55.051 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[84f1e4a6-0ca9-4826-84e2-55ee4c2da3d7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:55.147 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[272b99d4-c412-4674-a704-aecb6c829d41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:55.149 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ded2375-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:55.149 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:55.150 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap3ded2375-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:23:55 np0005474864 NetworkManager[51631]: <info>  [1759868635.1539] manager: (tap3ded2375-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/141)
Oct  7 16:23:55 np0005474864 nova_compute[192593]: 2025-10-07 20:23:55.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:55 np0005474864 kernel: tap3ded2375-c0: entered promiscuous mode
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:55.160 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap3ded2375-c0, col_values=(('external_ids', {'iface-id': '4d0c2d13-6523-49c5-b37f-de556fa2d853'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:23:55 np0005474864 nova_compute[192593]: 2025-10-07 20:23:55.159 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:55 np0005474864 ovn_controller[94801]: 2025-10-07T20:23:55Z|00266|binding|INFO|Releasing lport 4d0c2d13-6523-49c5-b37f-de556fa2d853 from this chassis (sb_readonly=0)
Oct  7 16:23:55 np0005474864 nova_compute[192593]: 2025-10-07 20:23:55.162 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:55 np0005474864 nova_compute[192593]: 2025-10-07 20:23:55.166 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:55.170 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/3ded2375-cc27-4f2e-bd40-5d10893e1420.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/3ded2375-cc27-4f2e-bd40-5d10893e1420.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:55.171 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[761c2672-dea7-47d4-b794-82e9b98c8fc5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:55.172 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-3ded2375-cc27-4f2e-bd40-5d10893e1420
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/3ded2375-cc27-4f2e-bd40-5d10893e1420.pid.haproxy
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID 3ded2375-cc27-4f2e-bd40-5d10893e1420
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:23:55 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:23:55.173 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420', 'env', 'PROCESS_TAG=haproxy-3ded2375-cc27-4f2e-bd40-5d10893e1420', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/3ded2375-cc27-4f2e-bd40-5d10893e1420.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:23:55 np0005474864 nova_compute[192593]: 2025-10-07 20:23:55.194 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:55 np0005474864 nova_compute[192593]: 2025-10-07 20:23:55.607 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868635.6065733, 16e9367c-3ee4-4605-bcb0-bc881516215c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:23:55 np0005474864 nova_compute[192593]: 2025-10-07 20:23:55.610 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] VM Started (Lifecycle Event)#033[00m
Oct  7 16:23:55 np0005474864 nova_compute[192593]: 2025-10-07 20:23:55.636 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:23:55 np0005474864 nova_compute[192593]: 2025-10-07 20:23:55.642 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868635.608469, 16e9367c-3ee4-4605-bcb0-bc881516215c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:23:55 np0005474864 nova_compute[192593]: 2025-10-07 20:23:55.642 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:23:55 np0005474864 podman[229940]: 2025-10-07 20:23:55.556361931 +0000 UTC m=+0.039033464 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:23:55 np0005474864 nova_compute[192593]: 2025-10-07 20:23:55.661 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:23:55 np0005474864 nova_compute[192593]: 2025-10-07 20:23:55.670 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:23:55 np0005474864 nova_compute[192593]: 2025-10-07 20:23:55.699 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:23:55 np0005474864 podman[229940]: 2025-10-07 20:23:55.744593865 +0000 UTC m=+0.227265398 container create 6421f867ed3194afcfd254fea96ee527a15f13e59cc415c66e82882f6e714f1b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:23:55 np0005474864 systemd[1]: Started libpod-conmon-6421f867ed3194afcfd254fea96ee527a15f13e59cc415c66e82882f6e714f1b.scope.
Oct  7 16:23:55 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:23:55 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f5836d54d49dd12cecf49e517f809dae9f1607dd9300fe7300fb2da3b846841d/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:23:55 np0005474864 podman[229940]: 2025-10-07 20:23:55.833855932 +0000 UTC m=+0.316527465 container init 6421f867ed3194afcfd254fea96ee527a15f13e59cc415c66e82882f6e714f1b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  7 16:23:55 np0005474864 podman[229940]: 2025-10-07 20:23:55.839290098 +0000 UTC m=+0.321961611 container start 6421f867ed3194afcfd254fea96ee527a15f13e59cc415c66e82882f6e714f1b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  7 16:23:55 np0005474864 neutron-haproxy-ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420[229956]: [NOTICE]   (229972) : New worker (229980) forked
Oct  7 16:23:55 np0005474864 neutron-haproxy-ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420[229956]: [NOTICE]   (229972) : Loading success.
Oct  7 16:23:55 np0005474864 podman[229953]: 2025-10-07 20:23:55.880005059 +0000 UTC m=+0.086044456 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.830 2 DEBUG nova.compute.manager [req-8b6f12b0-ac4a-47f1-baab-b74f9e0fa089 req-d5b451d8-40d6-40c3-ae0b-63a713d2c6fb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Received event network-vif-plugged-f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.832 2 DEBUG oslo_concurrency.lockutils [req-8b6f12b0-ac4a-47f1-baab-b74f9e0fa089 req-d5b451d8-40d6-40c3-ae0b-63a713d2c6fb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.833 2 DEBUG oslo_concurrency.lockutils [req-8b6f12b0-ac4a-47f1-baab-b74f9e0fa089 req-d5b451d8-40d6-40c3-ae0b-63a713d2c6fb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.833 2 DEBUG oslo_concurrency.lockutils [req-8b6f12b0-ac4a-47f1-baab-b74f9e0fa089 req-d5b451d8-40d6-40c3-ae0b-63a713d2c6fb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.834 2 DEBUG nova.compute.manager [req-8b6f12b0-ac4a-47f1-baab-b74f9e0fa089 req-d5b451d8-40d6-40c3-ae0b-63a713d2c6fb 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Processing event network-vif-plugged-f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.835 2 DEBUG nova.compute.manager [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.841 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868637.8407562, 16e9367c-3ee4-4605-bcb0-bc881516215c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.841 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.843 2 DEBUG nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.846 2 INFO nova.virt.libvirt.driver [-] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Instance spawned successfully.#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.846 2 DEBUG nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.873 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.881 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.885 2 DEBUG nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.886 2 DEBUG nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.886 2 DEBUG nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.887 2 DEBUG nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.887 2 DEBUG nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.888 2 DEBUG nova.virt.libvirt.driver [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.935 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.969 2 INFO nova.compute.manager [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Took 8.24 seconds to spawn the instance on the hypervisor.#033[00m
Oct  7 16:23:57 np0005474864 nova_compute[192593]: 2025-10-07 20:23:57.970 2 DEBUG nova.compute.manager [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:23:58 np0005474864 nova_compute[192593]: 2025-10-07 20:23:58.036 2 INFO nova.compute.manager [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Took 8.85 seconds to build instance.#033[00m
Oct  7 16:23:58 np0005474864 nova_compute[192593]: 2025-10-07 20:23:58.055 2 DEBUG oslo_concurrency.lockutils [None req-aac4e783-7d35-4b28-bb90-e7e43d3a4c96 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lock "16e9367c-3ee4-4605-bcb0-bc881516215c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.938s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:23:58 np0005474864 nova_compute[192593]: 2025-10-07 20:23:58.199 2 DEBUG nova.network.neutron [req-86a06ba8-d1f2-44f9-9415-a12e3766a2d2 req-9f85b147-e519-4727-a3cd-75ec7bafc84a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Updated VIF entry in instance network info cache for port f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:23:58 np0005474864 nova_compute[192593]: 2025-10-07 20:23:58.200 2 DEBUG nova.network.neutron [req-86a06ba8-d1f2-44f9-9415-a12e3766a2d2 req-9f85b147-e519-4727-a3cd-75ec7bafc84a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Updating instance_info_cache with network_info: [{"id": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "address": "fa:16:3e:79:de:95", "network": {"id": "3ded2375-cc27-4f2e-bd40-5d10893e1420", "bridge": "br-int", "label": "tempest-TestServerMultinode-293113049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "525c3eed395940779f653322e2fd6d0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65ac9cd-66", "ovs_interfaceid": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:23:58 np0005474864 nova_compute[192593]: 2025-10-07 20:23:58.220 2 DEBUG oslo_concurrency.lockutils [req-86a06ba8-d1f2-44f9-9415-a12e3766a2d2 req-9f85b147-e519-4727-a3cd-75ec7bafc84a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-16e9367c-3ee4-4605-bcb0-bc881516215c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:23:58 np0005474864 nova_compute[192593]: 2025-10-07 20:23:58.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:59 np0005474864 nova_compute[192593]: 2025-10-07 20:23:59.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:23:59 np0005474864 podman[229989]: 2025-10-07 20:23:59.389092898 +0000 UTC m=+0.081474654 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:24:00 np0005474864 nova_compute[192593]: 2025-10-07 20:24:00.069 2 DEBUG nova.compute.manager [req-49917390-931f-4c4e-9370-ad4beeab423c req-85e9c237-d1c2-43ea-8736-29e9b8c625a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Received event network-vif-plugged-f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:24:00 np0005474864 nova_compute[192593]: 2025-10-07 20:24:00.069 2 DEBUG oslo_concurrency.lockutils [req-49917390-931f-4c4e-9370-ad4beeab423c req-85e9c237-d1c2-43ea-8736-29e9b8c625a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:00 np0005474864 nova_compute[192593]: 2025-10-07 20:24:00.070 2 DEBUG oslo_concurrency.lockutils [req-49917390-931f-4c4e-9370-ad4beeab423c req-85e9c237-d1c2-43ea-8736-29e9b8c625a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:00 np0005474864 nova_compute[192593]: 2025-10-07 20:24:00.070 2 DEBUG oslo_concurrency.lockutils [req-49917390-931f-4c4e-9370-ad4beeab423c req-85e9c237-d1c2-43ea-8736-29e9b8c625a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:00 np0005474864 nova_compute[192593]: 2025-10-07 20:24:00.070 2 DEBUG nova.compute.manager [req-49917390-931f-4c4e-9370-ad4beeab423c req-85e9c237-d1c2-43ea-8736-29e9b8c625a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] No waiting events found dispatching network-vif-plugged-f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:24:00 np0005474864 nova_compute[192593]: 2025-10-07 20:24:00.070 2 WARNING nova.compute.manager [req-49917390-931f-4c4e-9370-ad4beeab423c req-85e9c237-d1c2-43ea-8736-29e9b8c625a3 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Received unexpected event network-vif-plugged-f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 for instance with vm_state active and task_state None.#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.057 2 DEBUG oslo_concurrency.lockutils [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Acquiring lock "16e9367c-3ee4-4605-bcb0-bc881516215c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.058 2 DEBUG oslo_concurrency.lockutils [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lock "16e9367c-3ee4-4605-bcb0-bc881516215c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.059 2 DEBUG oslo_concurrency.lockutils [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Acquiring lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.059 2 DEBUG oslo_concurrency.lockutils [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.060 2 DEBUG oslo_concurrency.lockutils [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.062 2 INFO nova.compute.manager [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Terminating instance#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.064 2 DEBUG nova.compute.manager [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  7 16:24:01 np0005474864 kernel: tapf65ac9cd-66 (unregistering): left promiscuous mode
Oct  7 16:24:01 np0005474864 NetworkManager[51631]: <info>  [1759868641.0931] device (tapf65ac9cd-66): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:24:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:01Z|00267|binding|INFO|Releasing lport f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 from this chassis (sb_readonly=0)
Oct  7 16:24:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:01Z|00268|binding|INFO|Setting lport f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 down in Southbound
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:01Z|00269|binding|INFO|Removing iface tapf65ac9cd-66 ovn-installed in OVS
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.112 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.120 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:de:95 10.100.0.7'], port_security=['fa:16:3e:79:de:95 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '16e9367c-3ee4-4605-bcb0-bc881516215c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ded2375-cc27-4f2e-bd40-5d10893e1420', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4bb838573ec845c1a1e779d97ada653c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8fe8549c-5285-4ed4-82d3-024d96acca3d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a7e1768-94ad-4cd7-a4c9-b120fee2b0f1, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.123 103685 INFO neutron.agent.ovn.metadata.agent [-] Port f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 in datapath 3ded2375-cc27-4f2e-bd40-5d10893e1420 unbound from our chassis#033[00m
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.126 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3ded2375-cc27-4f2e-bd40-5d10893e1420, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.128 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[724551b9-bf63-4efd-a4ad-897d8fadc0a9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.129 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420 namespace which is not needed anymore#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.137 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:01 np0005474864 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000034.scope: Deactivated successfully.
Oct  7 16:24:01 np0005474864 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000034.scope: Consumed 3.998s CPU time.
Oct  7 16:24:01 np0005474864 systemd-machined[152586]: Machine qemu-18-instance-00000034 terminated.
Oct  7 16:24:01 np0005474864 neutron-haproxy-ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420[229956]: [NOTICE]   (229972) : haproxy version is 2.8.14-c23fe91
Oct  7 16:24:01 np0005474864 neutron-haproxy-ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420[229956]: [NOTICE]   (229972) : path to executable is /usr/sbin/haproxy
Oct  7 16:24:01 np0005474864 neutron-haproxy-ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420[229956]: [WARNING]  (229972) : Exiting Master process...
Oct  7 16:24:01 np0005474864 neutron-haproxy-ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420[229956]: [ALERT]    (229972) : Current worker (229980) exited with code 143 (Terminated)
Oct  7 16:24:01 np0005474864 neutron-haproxy-ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420[229956]: [WARNING]  (229972) : All workers exited. Exiting... (0)
Oct  7 16:24:01 np0005474864 systemd[1]: libpod-6421f867ed3194afcfd254fea96ee527a15f13e59cc415c66e82882f6e714f1b.scope: Deactivated successfully.
Oct  7 16:24:01 np0005474864 conmon[229956]: conmon 6421f867ed3194afcfd2 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-6421f867ed3194afcfd254fea96ee527a15f13e59cc415c66e82882f6e714f1b.scope/container/memory.events
Oct  7 16:24:01 np0005474864 podman[230038]: 2025-10-07 20:24:01.267116639 +0000 UTC m=+0.047501337 container died 6421f867ed3194afcfd254fea96ee527a15f13e59cc415c66e82882f6e714f1b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:24:01 np0005474864 kernel: tapf65ac9cd-66: entered promiscuous mode
Oct  7 16:24:01 np0005474864 NetworkManager[51631]: <info>  [1759868641.2855] manager: (tapf65ac9cd-66): new Tun device (/org/freedesktop/NetworkManager/Devices/142)
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.287 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:01Z|00270|binding|INFO|Claiming lport f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 for this chassis.
Oct  7 16:24:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:01Z|00271|binding|INFO|f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235: Claiming fa:16:3e:79:de:95 10.100.0.7
Oct  7 16:24:01 np0005474864 kernel: tapf65ac9cd-66 (unregistering): left promiscuous mode
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.298 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:de:95 10.100.0.7'], port_security=['fa:16:3e:79:de:95 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '16e9367c-3ee4-4605-bcb0-bc881516215c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ded2375-cc27-4f2e-bd40-5d10893e1420', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4bb838573ec845c1a1e779d97ada653c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8fe8549c-5285-4ed4-82d3-024d96acca3d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a7e1768-94ad-4cd7-a4c9-b120fee2b0f1, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:24:01 np0005474864 systemd[1]: var-lib-containers-storage-overlay-f5836d54d49dd12cecf49e517f809dae9f1607dd9300fe7300fb2da3b846841d-merged.mount: Deactivated successfully.
Oct  7 16:24:01 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-6421f867ed3194afcfd254fea96ee527a15f13e59cc415c66e82882f6e714f1b-userdata-shm.mount: Deactivated successfully.
Oct  7 16:24:01 np0005474864 podman[230038]: 2025-10-07 20:24:01.321106322 +0000 UTC m=+0.101491020 container cleanup 6421f867ed3194afcfd254fea96ee527a15f13e59cc415c66e82882f6e714f1b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001)
Oct  7 16:24:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:01Z|00272|binding|INFO|Setting lport f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 ovn-installed in OVS
Oct  7 16:24:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:01Z|00273|binding|INFO|Setting lport f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 up in Southbound
Oct  7 16:24:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:01Z|00274|binding|INFO|Releasing lport f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 from this chassis (sb_readonly=1)
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:01Z|00275|if_status|INFO|Not setting lport f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 down as sb is readonly
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.324 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:01Z|00276|binding|INFO|Removing iface tapf65ac9cd-66 ovn-installed in OVS
Oct  7 16:24:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:01Z|00277|binding|INFO|Releasing lport f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 from this chassis (sb_readonly=0)
Oct  7 16:24:01 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:01Z|00278|binding|INFO|Setting lport f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 down in Southbound
Oct  7 16:24:01 np0005474864 systemd[1]: libpod-conmon-6421f867ed3194afcfd254fea96ee527a15f13e59cc415c66e82882f6e714f1b.scope: Deactivated successfully.
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.336 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.339 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:de:95 10.100.0.7'], port_security=['fa:16:3e:79:de:95 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '16e9367c-3ee4-4605-bcb0-bc881516215c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-3ded2375-cc27-4f2e-bd40-5d10893e1420', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '4bb838573ec845c1a1e779d97ada653c', 'neutron:revision_number': '4', 'neutron:security_group_ids': '8fe8549c-5285-4ed4-82d3-024d96acca3d', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=6a7e1768-94ad-4cd7-a4c9-b120fee2b0f1, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.355 2 INFO nova.virt.libvirt.driver [-] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Instance destroyed successfully.#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.355 2 DEBUG nova.objects.instance [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lazy-loading 'resources' on Instance uuid 16e9367c-3ee4-4605-bcb0-bc881516215c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.369 2 DEBUG nova.virt.libvirt.vif [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:23:48Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-1379723732',display_name='tempest-TestServerMultinode-server-1379723732',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-1379723732',id=52,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:23:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='4bb838573ec845c1a1e779d97ada653c',ramdisk_id='',reservation_id='r-9hjmlnmk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='admin,reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_d
isk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-471967560',owner_user_name='tempest-TestServerMultinode-471967560-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:23:58Z,user_data=None,user_id='81802ec6d167452692d5e5475be8e6ba',uuid=16e9367c-3ee4-4605-bcb0-bc881516215c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "address": "fa:16:3e:79:de:95", "network": {"id": "3ded2375-cc27-4f2e-bd40-5d10893e1420", "bridge": "br-int", "label": "tempest-TestServerMultinode-293113049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "525c3eed395940779f653322e2fd6d0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65ac9cd-66", "ovs_interfaceid": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.369 2 DEBUG nova.network.os_vif_util [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Converting VIF {"id": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "address": "fa:16:3e:79:de:95", "network": {"id": "3ded2375-cc27-4f2e-bd40-5d10893e1420", "bridge": "br-int", "label": "tempest-TestServerMultinode-293113049-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "525c3eed395940779f653322e2fd6d0e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf65ac9cd-66", "ovs_interfaceid": "f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.370 2 DEBUG nova.network.os_vif_util [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:79:de:95,bridge_name='br-int',has_traffic_filtering=True,id=f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235,network=Network(3ded2375-cc27-4f2e-bd40-5d10893e1420),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf65ac9cd-66') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.370 2 DEBUG os_vif [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:de:95,bridge_name='br-int',has_traffic_filtering=True,id=f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235,network=Network(3ded2375-cc27-4f2e-bd40-5d10893e1420),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf65ac9cd-66') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.372 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf65ac9cd-66, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.377 2 INFO os_vif [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:de:95,bridge_name='br-int',has_traffic_filtering=True,id=f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235,network=Network(3ded2375-cc27-4f2e-bd40-5d10893e1420),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf65ac9cd-66')#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.378 2 INFO nova.virt.libvirt.driver [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Deleting instance files /var/lib/nova/instances/16e9367c-3ee4-4605-bcb0-bc881516215c_del#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.378 2 INFO nova.virt.libvirt.driver [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Deletion of /var/lib/nova/instances/16e9367c-3ee4-4605-bcb0-bc881516215c_del complete#033[00m
Oct  7 16:24:01 np0005474864 podman[230080]: 2025-10-07 20:24:01.393556506 +0000 UTC m=+0.045794028 container remove 6421f867ed3194afcfd254fea96ee527a15f13e59cc415c66e82882f6e714f1b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.402 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[67c2773f-c4ab-43d7-b3d1-dc8b6ca425c8]: (4, ('Tue Oct  7 08:24:01 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420 (6421f867ed3194afcfd254fea96ee527a15f13e59cc415c66e82882f6e714f1b)\n6421f867ed3194afcfd254fea96ee527a15f13e59cc415c66e82882f6e714f1b\nTue Oct  7 08:24:01 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420 (6421f867ed3194afcfd254fea96ee527a15f13e59cc415c66e82882f6e714f1b)\n6421f867ed3194afcfd254fea96ee527a15f13e59cc415c66e82882f6e714f1b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.404 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[204396b6-ba06-4db4-90b8-5c09485d5437]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.405 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap3ded2375-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.406 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:01 np0005474864 kernel: tap3ded2375-c0: left promiscuous mode
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.422 2 INFO nova.compute.manager [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.423 2 DEBUG oslo.service.loopingcall [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.425 2 DEBUG nova.compute.manager [-] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  7 16:24:01 np0005474864 nova_compute[192593]: 2025-10-07 20:24:01.425 2 DEBUG nova.network.neutron [-] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.426 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[03c57334-bfe8-458e-82f2-2945ff7bb959]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.458 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[8ffd1bf0-0794-446f-ae0e-9bd0ee58c209]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.460 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[bee573ad-1826-45cc-82ff-1dc98355a0c5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.488 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[03b0a00c-5c09-4228-a4be-dc89f4f625d9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 427075, 'reachable_time': 23732, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230099, 'error': None, 'target': 'ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:01 np0005474864 systemd[1]: run-netns-ovnmeta\x2d3ded2375\x2dcc27\x2d4f2e\x2dbd40\x2d5d10893e1420.mount: Deactivated successfully.
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.492 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-3ded2375-cc27-4f2e-bd40-5d10893e1420 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.493 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[e4737892-4b8d-4f0f-b7a2-586dfbb42732]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.495 103685 INFO neutron.agent.ovn.metadata.agent [-] Port f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 in datapath 3ded2375-cc27-4f2e-bd40-5d10893e1420 unbound from our chassis#033[00m
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.498 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3ded2375-cc27-4f2e-bd40-5d10893e1420, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.499 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[97b2e395-2444-4a3a-b5bf-2b42198218fe]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.500 103685 INFO neutron.agent.ovn.metadata.agent [-] Port f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 in datapath 3ded2375-cc27-4f2e-bd40-5d10893e1420 unbound from our chassis#033[00m
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.502 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 3ded2375-cc27-4f2e-bd40-5d10893e1420, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:24:01 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:01.503 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[7fe4da63-bd6e-4937-8bcf-c0042c3c49e2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.161 2 DEBUG nova.compute.manager [req-ca43824d-8e13-4894-8441-86a56cf1eb8c req-16f2eb05-3298-46d5-a428-374d13e1bd24 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Received event network-vif-unplugged-f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.162 2 DEBUG oslo_concurrency.lockutils [req-ca43824d-8e13-4894-8441-86a56cf1eb8c req-16f2eb05-3298-46d5-a428-374d13e1bd24 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.162 2 DEBUG oslo_concurrency.lockutils [req-ca43824d-8e13-4894-8441-86a56cf1eb8c req-16f2eb05-3298-46d5-a428-374d13e1bd24 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.163 2 DEBUG oslo_concurrency.lockutils [req-ca43824d-8e13-4894-8441-86a56cf1eb8c req-16f2eb05-3298-46d5-a428-374d13e1bd24 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.163 2 DEBUG nova.compute.manager [req-ca43824d-8e13-4894-8441-86a56cf1eb8c req-16f2eb05-3298-46d5-a428-374d13e1bd24 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] No waiting events found dispatching network-vif-unplugged-f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.163 2 DEBUG nova.compute.manager [req-ca43824d-8e13-4894-8441-86a56cf1eb8c req-16f2eb05-3298-46d5-a428-374d13e1bd24 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Received event network-vif-unplugged-f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.164 2 DEBUG nova.compute.manager [req-ca43824d-8e13-4894-8441-86a56cf1eb8c req-16f2eb05-3298-46d5-a428-374d13e1bd24 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Received event network-vif-plugged-f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.164 2 DEBUG oslo_concurrency.lockutils [req-ca43824d-8e13-4894-8441-86a56cf1eb8c req-16f2eb05-3298-46d5-a428-374d13e1bd24 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.164 2 DEBUG oslo_concurrency.lockutils [req-ca43824d-8e13-4894-8441-86a56cf1eb8c req-16f2eb05-3298-46d5-a428-374d13e1bd24 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.164 2 DEBUG oslo_concurrency.lockutils [req-ca43824d-8e13-4894-8441-86a56cf1eb8c req-16f2eb05-3298-46d5-a428-374d13e1bd24 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.165 2 DEBUG nova.compute.manager [req-ca43824d-8e13-4894-8441-86a56cf1eb8c req-16f2eb05-3298-46d5-a428-374d13e1bd24 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] No waiting events found dispatching network-vif-plugged-f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.165 2 WARNING nova.compute.manager [req-ca43824d-8e13-4894-8441-86a56cf1eb8c req-16f2eb05-3298-46d5-a428-374d13e1bd24 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Received unexpected event network-vif-plugged-f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 for instance with vm_state active and task_state deleting.#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.276 2 DEBUG nova.network.neutron [-] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.298 2 INFO nova.compute.manager [-] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Took 0.87 seconds to deallocate network for instance.#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.394 2 DEBUG oslo_concurrency.lockutils [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.395 2 DEBUG oslo_concurrency.lockutils [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.475 2 DEBUG nova.compute.provider_tree [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.489 2 DEBUG nova.scheduler.client.report [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.511 2 DEBUG oslo_concurrency.lockutils [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.116s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.535 2 INFO nova.scheduler.client.report [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Deleted allocations for instance 16e9367c-3ee4-4605-bcb0-bc881516215c#033[00m
Oct  7 16:24:02 np0005474864 nova_compute[192593]: 2025-10-07 20:24:02.591 2 DEBUG oslo_concurrency.lockutils [None req-e7b6876e-92b3-4fe2-9edb-4fcd6eb81e86 81802ec6d167452692d5e5475be8e6ba 4bb838573ec845c1a1e779d97ada653c - - default default] Lock "16e9367c-3ee4-4605-bcb0-bc881516215c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.533s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:04 np0005474864 nova_compute[192593]: 2025-10-07 20:24:04.282 2 DEBUG nova.compute.manager [req-148d6a43-955b-410a-9743-177520bfb580 req-918daf7d-96f5-45ab-8fb4-a7fe01289290 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Received event network-vif-plugged-f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:24:04 np0005474864 nova_compute[192593]: 2025-10-07 20:24:04.283 2 DEBUG oslo_concurrency.lockutils [req-148d6a43-955b-410a-9743-177520bfb580 req-918daf7d-96f5-45ab-8fb4-a7fe01289290 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:04 np0005474864 nova_compute[192593]: 2025-10-07 20:24:04.283 2 DEBUG oslo_concurrency.lockutils [req-148d6a43-955b-410a-9743-177520bfb580 req-918daf7d-96f5-45ab-8fb4-a7fe01289290 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:04 np0005474864 nova_compute[192593]: 2025-10-07 20:24:04.284 2 DEBUG oslo_concurrency.lockutils [req-148d6a43-955b-410a-9743-177520bfb580 req-918daf7d-96f5-45ab-8fb4-a7fe01289290 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "16e9367c-3ee4-4605-bcb0-bc881516215c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:04 np0005474864 nova_compute[192593]: 2025-10-07 20:24:04.284 2 DEBUG nova.compute.manager [req-148d6a43-955b-410a-9743-177520bfb580 req-918daf7d-96f5-45ab-8fb4-a7fe01289290 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] No waiting events found dispatching network-vif-plugged-f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:24:04 np0005474864 nova_compute[192593]: 2025-10-07 20:24:04.285 2 WARNING nova.compute.manager [req-148d6a43-955b-410a-9743-177520bfb580 req-918daf7d-96f5-45ab-8fb4-a7fe01289290 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Received unexpected event network-vif-plugged-f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 for instance with vm_state deleted and task_state None.#033[00m
Oct  7 16:24:04 np0005474864 nova_compute[192593]: 2025-10-07 20:24:04.285 2 DEBUG nova.compute.manager [req-148d6a43-955b-410a-9743-177520bfb580 req-918daf7d-96f5-45ab-8fb4-a7fe01289290 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Received event network-vif-deleted-f65ac9cd-66c7-4aff-81bc-4c6bf4cd7235 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:24:04 np0005474864 nova_compute[192593]: 2025-10-07 20:24:04.317 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:04 np0005474864 podman[230100]: 2025-10-07 20:24:04.387039422 +0000 UTC m=+0.077141630 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, tcib_managed=true, container_name=ceilometer_agent_compute)
Oct  7 16:24:06 np0005474864 nova_compute[192593]: 2025-10-07 20:24:06.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:06 np0005474864 nova_compute[192593]: 2025-10-07 20:24:06.552 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:06 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:06.554 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:76:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'ba:bb:91:e8:2b:5d'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:24:06 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:06.555 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  7 16:24:06 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:06.557 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:24:09 np0005474864 nova_compute[192593]: 2025-10-07 20:24:09.319 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:11 np0005474864 nova_compute[192593]: 2025-10-07 20:24:11.377 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:14 np0005474864 nova_compute[192593]: 2025-10-07 20:24:14.322 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:16 np0005474864 nova_compute[192593]: 2025-10-07 20:24:16.121 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:16.200 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:16.201 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:16.201 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:16 np0005474864 nova_compute[192593]: 2025-10-07 20:24:16.353 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759868641.3517346, 16e9367c-3ee4-4605-bcb0-bc881516215c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:24:16 np0005474864 nova_compute[192593]: 2025-10-07 20:24:16.354 2 INFO nova.compute.manager [-] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] VM Stopped (Lifecycle Event)#033[00m
Oct  7 16:24:16 np0005474864 nova_compute[192593]: 2025-10-07 20:24:16.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:16 np0005474864 nova_compute[192593]: 2025-10-07 20:24:16.389 2 DEBUG nova.compute.manager [None req-a5148d0c-3b62-4cef-a5cd-2831b2cc8b53 - - - - - -] [instance: 16e9367c-3ee4-4605-bcb0-bc881516215c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:24:17 np0005474864 podman[230122]: 2025-10-07 20:24:17.379613131 +0000 UTC m=+0.068264464 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:24:17 np0005474864 podman[230123]: 2025-10-07 20:24:17.385072978 +0000 UTC m=+0.060925133 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., version=9.6, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, container_name=openstack_network_exporter, io.openshift.expose-services=)
Oct  7 16:24:19 np0005474864 nova_compute[192593]: 2025-10-07 20:24:19.380 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:21 np0005474864 podman[230167]: 2025-10-07 20:24:21.361723866 +0000 UTC m=+0.055138637 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:24:21 np0005474864 podman[230169]: 2025-10-07 20:24:21.36845123 +0000 UTC m=+0.053627274 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS)
Oct  7 16:24:21 np0005474864 nova_compute[192593]: 2025-10-07 20:24:21.382 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:21 np0005474864 podman[230168]: 2025-10-07 20:24:21.397045792 +0000 UTC m=+0.085499290 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  7 16:24:24 np0005474864 nova_compute[192593]: 2025-10-07 20:24:24.383 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:25 np0005474864 nova_compute[192593]: 2025-10-07 20:24:25.652 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "07f750b6-5548-4357-b8c0-426ee842fd13" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:25 np0005474864 nova_compute[192593]: 2025-10-07 20:24:25.653 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "07f750b6-5548-4357-b8c0-426ee842fd13" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:25 np0005474864 nova_compute[192593]: 2025-10-07 20:24:25.667 2 DEBUG nova.compute.manager [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  7 16:24:25 np0005474864 nova_compute[192593]: 2025-10-07 20:24:25.765 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:25 np0005474864 nova_compute[192593]: 2025-10-07 20:24:25.765 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:25 np0005474864 nova_compute[192593]: 2025-10-07 20:24:25.773 2 DEBUG nova.virt.hardware [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  7 16:24:25 np0005474864 nova_compute[192593]: 2025-10-07 20:24:25.774 2 INFO nova.compute.claims [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  7 16:24:25 np0005474864 nova_compute[192593]: 2025-10-07 20:24:25.977 2 DEBUG nova.compute.provider_tree [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:24:25 np0005474864 nova_compute[192593]: 2025-10-07 20:24:25.992 2 DEBUG nova.scheduler.client.report [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.020 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.021 2 DEBUG nova.compute.manager [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.089 2 DEBUG nova.compute.manager [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.089 2 DEBUG nova.network.neutron [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.117 2 INFO nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.155 2 DEBUG nova.compute.manager [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.259 2 DEBUG nova.compute.manager [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.261 2 DEBUG nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.262 2 INFO nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Creating image(s)#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.263 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "/var/lib/nova/instances/07f750b6-5548-4357-b8c0-426ee842fd13/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.264 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "/var/lib/nova/instances/07f750b6-5548-4357-b8c0-426ee842fd13/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.265 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "/var/lib/nova/instances/07f750b6-5548-4357-b8c0-426ee842fd13/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.290 2 DEBUG oslo_concurrency.processutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:24:26 np0005474864 podman[230231]: 2025-10-07 20:24:26.364287845 +0000 UTC m=+0.054600962 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.367 2 DEBUG oslo_concurrency.processutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.369 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.370 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.392 2 DEBUG oslo_concurrency.processutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.413 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.461 2 DEBUG oslo_concurrency.processutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.462 2 DEBUG oslo_concurrency.processutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/07f750b6-5548-4357-b8c0-426ee842fd13/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.488 2 DEBUG nova.policy [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.679 2 DEBUG oslo_concurrency.processutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/07f750b6-5548-4357-b8c0-426ee842fd13/disk 1073741824" returned: 0 in 0.217s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.680 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.310s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.681 2 DEBUG oslo_concurrency.processutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.731 2 DEBUG oslo_concurrency.processutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.732 2 DEBUG nova.virt.disk.api [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Checking if we can resize image /var/lib/nova/instances/07f750b6-5548-4357-b8c0-426ee842fd13/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.733 2 DEBUG oslo_concurrency.processutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07f750b6-5548-4357-b8c0-426ee842fd13/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.821 2 DEBUG oslo_concurrency.processutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/07f750b6-5548-4357-b8c0-426ee842fd13/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.824 2 DEBUG nova.virt.disk.api [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Cannot resize image /var/lib/nova/instances/07f750b6-5548-4357-b8c0-426ee842fd13/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.825 2 DEBUG nova.objects.instance [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'migration_context' on Instance uuid 07f750b6-5548-4357-b8c0-426ee842fd13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.851 2 DEBUG nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.853 2 DEBUG nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Ensure instance console log exists: /var/lib/nova/instances/07f750b6-5548-4357-b8c0-426ee842fd13/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.854 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.855 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:26 np0005474864 nova_compute[192593]: 2025-10-07 20:24:26.856 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:27 np0005474864 nova_compute[192593]: 2025-10-07 20:24:27.358 2 DEBUG nova.network.neutron [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Successfully created port: 6ed165c8-9f03-473d-9ce0-008ebbb2ad82 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  7 16:24:28 np0005474864 nova_compute[192593]: 2025-10-07 20:24:28.220 2 DEBUG nova.network.neutron [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Successfully updated port: 6ed165c8-9f03-473d-9ce0-008ebbb2ad82 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:24:28 np0005474864 nova_compute[192593]: 2025-10-07 20:24:28.249 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "refresh_cache-07f750b6-5548-4357-b8c0-426ee842fd13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:24:28 np0005474864 nova_compute[192593]: 2025-10-07 20:24:28.250 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquired lock "refresh_cache-07f750b6-5548-4357-b8c0-426ee842fd13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:24:28 np0005474864 nova_compute[192593]: 2025-10-07 20:24:28.250 2 DEBUG nova.network.neutron [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:24:28 np0005474864 nova_compute[192593]: 2025-10-07 20:24:28.360 2 DEBUG nova.compute.manager [req-5e5a5b21-c1e3-45cd-8346-743330e61392 req-3615ae18-4fed-4090-9b97-b90e812a9531 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Received event network-changed-6ed165c8-9f03-473d-9ce0-008ebbb2ad82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:24:28 np0005474864 nova_compute[192593]: 2025-10-07 20:24:28.361 2 DEBUG nova.compute.manager [req-5e5a5b21-c1e3-45cd-8346-743330e61392 req-3615ae18-4fed-4090-9b97-b90e812a9531 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Refreshing instance network info cache due to event network-changed-6ed165c8-9f03-473d-9ce0-008ebbb2ad82. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:24:28 np0005474864 nova_compute[192593]: 2025-10-07 20:24:28.361 2 DEBUG oslo_concurrency.lockutils [req-5e5a5b21-c1e3-45cd-8346-743330e61392 req-3615ae18-4fed-4090-9b97-b90e812a9531 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-07f750b6-5548-4357-b8c0-426ee842fd13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:24:28 np0005474864 nova_compute[192593]: 2025-10-07 20:24:28.453 2 DEBUG nova.network.neutron [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:24:29 np0005474864 nova_compute[192593]: 2025-10-07 20:24:29.105 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:24:29 np0005474864 nova_compute[192593]: 2025-10-07 20:24:29.385 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.057 2 DEBUG nova.network.neutron [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Updating instance_info_cache with network_info: [{"id": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "address": "fa:16:3e:ff:9e:d9", "network": {"id": "c77dfc09-a940-4330-b50f-d7b09c70d5c0", "bridge": "br-int", "label": "tempest-network-smoke--1165022391", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ed165c8-9f", "ovs_interfaceid": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.091 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.101 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Releasing lock "refresh_cache-07f750b6-5548-4357-b8c0-426ee842fd13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.101 2 DEBUG nova.compute.manager [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Instance network_info: |[{"id": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "address": "fa:16:3e:ff:9e:d9", "network": {"id": "c77dfc09-a940-4330-b50f-d7b09c70d5c0", "bridge": "br-int", "label": "tempest-network-smoke--1165022391", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ed165c8-9f", "ovs_interfaceid": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.102 2 DEBUG oslo_concurrency.lockutils [req-5e5a5b21-c1e3-45cd-8346-743330e61392 req-3615ae18-4fed-4090-9b97-b90e812a9531 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-07f750b6-5548-4357-b8c0-426ee842fd13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.103 2 DEBUG nova.network.neutron [req-5e5a5b21-c1e3-45cd-8346-743330e61392 req-3615ae18-4fed-4090-9b97-b90e812a9531 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Refreshing network info cache for port 6ed165c8-9f03-473d-9ce0-008ebbb2ad82 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.108 2 DEBUG nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Start _get_guest_xml network_info=[{"id": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "address": "fa:16:3e:ff:9e:d9", "network": {"id": "c77dfc09-a940-4330-b50f-d7b09c70d5c0", "bridge": "br-int", "label": "tempest-network-smoke--1165022391", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ed165c8-9f", "ovs_interfaceid": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.118 2 WARNING nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.130 2 DEBUG nova.virt.libvirt.host [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.131 2 DEBUG nova.virt.libvirt.host [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.151 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.152 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.152 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.153 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.156 2 DEBUG nova.virt.libvirt.host [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.157 2 DEBUG nova.virt.libvirt.host [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.158 2 DEBUG nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.159 2 DEBUG nova.virt.hardware [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T20:09:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3fec056a-1226-48ad-a02c-e4fe097a9363',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.160 2 DEBUG nova.virt.hardware [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.160 2 DEBUG nova.virt.hardware [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.161 2 DEBUG nova.virt.hardware [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.161 2 DEBUG nova.virt.hardware [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.161 2 DEBUG nova.virt.hardware [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.162 2 DEBUG nova.virt.hardware [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.162 2 DEBUG nova.virt.hardware [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.163 2 DEBUG nova.virt.hardware [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.163 2 DEBUG nova.virt.hardware [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.164 2 DEBUG nova.virt.hardware [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.170 2 DEBUG nova.virt.libvirt.vif [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:24:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1047191865',display_name='tempest-TestGettingAddress-server-1047191865',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1047191865',id=54,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBODApuobVWheYyTm3OBTjSn4TO/V3tPPQJs6pnABxyGhko3e1WXcNE+wSqvg7lzxnTDu6x2KD1f3a91mdVG/0EiHUoMJ2bm+0slGalNli+RQ8BMC263wwvs8kCY7EtokcQ==',key_name='tempest-TestGettingAddress-891638733',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-7503jqbq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:24:26Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=07f750b6-5548-4357-b8c0-426ee842fd13,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "address": "fa:16:3e:ff:9e:d9", "network": {"id": "c77dfc09-a940-4330-b50f-d7b09c70d5c0", "bridge": "br-int", "label": "tempest-network-smoke--1165022391", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ed165c8-9f", "ovs_interfaceid": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.171 2 DEBUG nova.network.os_vif_util [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "address": "fa:16:3e:ff:9e:d9", "network": {"id": "c77dfc09-a940-4330-b50f-d7b09c70d5c0", "bridge": "br-int", "label": "tempest-network-smoke--1165022391", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ed165c8-9f", "ovs_interfaceid": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.172 2 DEBUG nova.network.os_vif_util [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:9e:d9,bridge_name='br-int',has_traffic_filtering=True,id=6ed165c8-9f03-473d-9ce0-008ebbb2ad82,network=Network(c77dfc09-a940-4330-b50f-d7b09c70d5c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ed165c8-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.174 2 DEBUG nova.objects.instance [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 07f750b6-5548-4357-b8c0-426ee842fd13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.205 2 DEBUG nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] End _get_guest_xml xml=<domain type="kvm">
Oct  7 16:24:30 np0005474864 nova_compute[192593]:  <uuid>07f750b6-5548-4357-b8c0-426ee842fd13</uuid>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:  <name>instance-00000036</name>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:  <memory>131072</memory>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:  <vcpu>1</vcpu>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <nova:name>tempest-TestGettingAddress-server-1047191865</nova:name>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <nova:creationTime>2025-10-07 20:24:30</nova:creationTime>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <nova:flavor name="m1.nano">
Oct  7 16:24:30 np0005474864 nova_compute[192593]:        <nova:memory>128</nova:memory>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:        <nova:disk>1</nova:disk>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:        <nova:swap>0</nova:swap>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:        <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:        <nova:vcpus>1</nova:vcpus>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      </nova:flavor>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <nova:owner>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:        <nova:user uuid="334f092941fc46c496c7def76b2cfe18">tempest-TestGettingAddress-626136673-project-member</nova:user>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:        <nova:project uuid="2f9bf744045540618c9980fd4a7694f5">tempest-TestGettingAddress-626136673</nova:project>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      </nova:owner>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <nova:ports>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:        <nova:port uuid="6ed165c8-9f03-473d-9ce0-008ebbb2ad82">
Oct  7 16:24:30 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:feff:9ed9" ipVersion="6"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:feff:9ed9" ipVersion="6"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      </nova:ports>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    </nova:instance>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:  <sysinfo type="smbios">
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <entry name="manufacturer">RDO</entry>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <entry name="product">OpenStack Compute</entry>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <entry name="serial">07f750b6-5548-4357-b8c0-426ee842fd13</entry>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <entry name="uuid">07f750b6-5548-4357-b8c0-426ee842fd13</entry>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <entry name="family">Virtual Machine</entry>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <boot dev="hd"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <smbios mode="sysinfo"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <vmcoreinfo/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:  <clock offset="utc">
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <timer name="pit" tickpolicy="delay"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <timer name="hpet" present="no"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:  <cpu mode="custom" match="exact">
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <model>Nehalem</model>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <topology sockets="1" cores="1" threads="1"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <disk type="file" device="disk">
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/07f750b6-5548-4357-b8c0-426ee842fd13/disk"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <target dev="vda" bus="virtio"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <disk type="file" device="cdrom">
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <driver name="qemu" type="raw" cache="none"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/07f750b6-5548-4357-b8c0-426ee842fd13/disk.config"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <target dev="sda" bus="sata"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:ff:9e:d9"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <target dev="tap6ed165c8-9f"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <serial type="pty">
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <log file="/var/lib/nova/instances/07f750b6-5548-4357-b8c0-426ee842fd13/console.log" append="off"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <input type="tablet" bus="usb"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <rng model="virtio">
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <backend model="random">/dev/urandom</backend>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <controller type="usb" index="0"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    <memballoon model="virtio">
Oct  7 16:24:30 np0005474864 nova_compute[192593]:      <stats period="10"/>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:24:30 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:24:30 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:24:30 np0005474864 nova_compute[192593]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.209 2 DEBUG nova.compute.manager [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Preparing to wait for external event network-vif-plugged-6ed165c8-9f03-473d-9ce0-008ebbb2ad82 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.209 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "07f750b6-5548-4357-b8c0-426ee842fd13-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.210 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "07f750b6-5548-4357-b8c0-426ee842fd13-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.210 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "07f750b6-5548-4357-b8c0-426ee842fd13-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.211 2 DEBUG nova.virt.libvirt.vif [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:24:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1047191865',display_name='tempest-TestGettingAddress-server-1047191865',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1047191865',id=54,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBODApuobVWheYyTm3OBTjSn4TO/V3tPPQJs6pnABxyGhko3e1WXcNE+wSqvg7lzxnTDu6x2KD1f3a91mdVG/0EiHUoMJ2bm+0slGalNli+RQ8BMC263wwvs8kCY7EtokcQ==',key_name='tempest-TestGettingAddress-891638733',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-7503jqbq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:24:26Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=07f750b6-5548-4357-b8c0-426ee842fd13,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "address": "fa:16:3e:ff:9e:d9", "network": {"id": "c77dfc09-a940-4330-b50f-d7b09c70d5c0", "bridge": "br-int", "label": "tempest-network-smoke--1165022391", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": 
{"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ed165c8-9f", "ovs_interfaceid": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.212 2 DEBUG nova.network.os_vif_util [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "address": "fa:16:3e:ff:9e:d9", "network": {"id": "c77dfc09-a940-4330-b50f-d7b09c70d5c0", "bridge": "br-int", "label": "tempest-network-smoke--1165022391", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ed165c8-9f", "ovs_interfaceid": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.213 2 DEBUG nova.network.os_vif_util [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ff:9e:d9,bridge_name='br-int',has_traffic_filtering=True,id=6ed165c8-9f03-473d-9ce0-008ebbb2ad82,network=Network(c77dfc09-a940-4330-b50f-d7b09c70d5c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ed165c8-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.214 2 DEBUG os_vif [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:9e:d9,bridge_name='br-int',has_traffic_filtering=True,id=6ed165c8-9f03-473d-9ce0-008ebbb2ad82,network=Network(c77dfc09-a940-4330-b50f-d7b09c70d5c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ed165c8-9f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.215 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.216 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.216 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.221 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.221 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6ed165c8-9f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.222 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6ed165c8-9f, col_values=(('external_ids', {'iface-id': '6ed165c8-9f03-473d-9ce0-008ebbb2ad82', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ff:9e:d9', 'vm-uuid': '07f750b6-5548-4357-b8c0-426ee842fd13'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:30 np0005474864 NetworkManager[51631]: <info>  [1759868670.2273] manager: (tap6ed165c8-9f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/143)
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.227 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.232 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.234 2 INFO os_vif [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ff:9e:d9,bridge_name='br-int',has_traffic_filtering=True,id=6ed165c8-9f03-473d-9ce0-008ebbb2ad82,network=Network(c77dfc09-a940-4330-b50f-d7b09c70d5c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ed165c8-9f')#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.299 2 DEBUG nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.300 2 DEBUG nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.300 2 DEBUG nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No VIF found with MAC fa:16:3e:ff:9e:d9, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:24:30 np0005474864 podman[230265]: 2025-10-07 20:24:30.301202269 +0000 UTC m=+0.076599314 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.301 2 INFO nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Using config drive#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.462 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.465 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5712MB free_disk=73.4552230834961GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.465 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.466 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.568 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Instance 07f750b6-5548-4357-b8c0-426ee842fd13 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.569 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.570 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.641 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.661 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.690 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.690 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.225s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.864 2 INFO nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Creating config drive at /var/lib/nova/instances/07f750b6-5548-4357-b8c0-426ee842fd13/disk.config#033[00m
Oct  7 16:24:30 np0005474864 nova_compute[192593]: 2025-10-07 20:24:30.869 2 DEBUG oslo_concurrency.processutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/07f750b6-5548-4357-b8c0-426ee842fd13/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdmzt_lcd execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:24:31 np0005474864 nova_compute[192593]: 2025-10-07 20:24:31.002 2 DEBUG oslo_concurrency.processutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/07f750b6-5548-4357-b8c0-426ee842fd13/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpdmzt_lcd" returned: 0 in 0.132s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:24:31 np0005474864 kernel: tap6ed165c8-9f: entered promiscuous mode
Oct  7 16:24:31 np0005474864 NetworkManager[51631]: <info>  [1759868671.1084] manager: (tap6ed165c8-9f): new Tun device (/org/freedesktop/NetworkManager/Devices/144)
Oct  7 16:24:31 np0005474864 nova_compute[192593]: 2025-10-07 20:24:31.109 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:31 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:31Z|00279|binding|INFO|Claiming lport 6ed165c8-9f03-473d-9ce0-008ebbb2ad82 for this chassis.
Oct  7 16:24:31 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:31Z|00280|binding|INFO|6ed165c8-9f03-473d-9ce0-008ebbb2ad82: Claiming fa:16:3e:ff:9e:d9 10.100.0.6 2001:db8:0:1:f816:3eff:feff:9ed9 2001:db8::f816:3eff:feff:9ed9
Oct  7 16:24:31 np0005474864 nova_compute[192593]: 2025-10-07 20:24:31.118 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:31 np0005474864 nova_compute[192593]: 2025-10-07 20:24:31.123 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:31 np0005474864 NetworkManager[51631]: <info>  [1759868671.1253] manager: (patch-provnet-53a6f422-cce6-4082-98aa-3619989346bc-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/145)
Oct  7 16:24:31 np0005474864 NetworkManager[51631]: <info>  [1759868671.1270] manager: (patch-br-int-to-provnet-53a6f422-cce6-4082-98aa-3619989346bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/146)
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.132 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:9e:d9 10.100.0.6 2001:db8:0:1:f816:3eff:feff:9ed9 2001:db8::f816:3eff:feff:9ed9'], port_security=['fa:16:3e:ff:9e:d9 10.100.0.6 2001:db8:0:1:f816:3eff:feff:9ed9 2001:db8::f816:3eff:feff:9ed9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28 2001:db8:0:1:f816:3eff:feff:9ed9/64 2001:db8::f816:3eff:feff:9ed9/64', 'neutron:device_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c77dfc09-a940-4330-b50f-d7b09c70d5c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fbea62e8-afa6-48f6-8f8e-1b935b8ad5da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d5b554d-fe47-46fd-9f7a-274db91b3c84, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=6ed165c8-9f03-473d-9ce0-008ebbb2ad82) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.134 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 6ed165c8-9f03-473d-9ce0-008ebbb2ad82 in datapath c77dfc09-a940-4330-b50f-d7b09c70d5c0 bound to our chassis#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.135 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c77dfc09-a940-4330-b50f-d7b09c70d5c0#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.153 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[a15e26d9-ba7a-411b-8f75-5ee7db9a2e17]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.154 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc77dfc09-a1 in ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.157 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc77dfc09-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.157 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d889455e-44f4-4655-84c1-c32041531945]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.158 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[6392d83f-a907-4112-90a1-83de240cbc1c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:31 np0005474864 systemd-udevd[230305]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.172 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[952c8fab-62ac-49ba-91e4-4ab25a6f0cc2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:31 np0005474864 NetworkManager[51631]: <info>  [1759868671.1799] device (tap6ed165c8-9f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:24:31 np0005474864 NetworkManager[51631]: <info>  [1759868671.1809] device (tap6ed165c8-9f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:24:31 np0005474864 systemd-machined[152586]: New machine qemu-19-instance-00000036.
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.209 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[1f99e3f2-539b-4925-91ab-5782b30709fa]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:31 np0005474864 systemd[1]: Started Virtual Machine qemu-19-instance-00000036.
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.258 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[799084fa-a46b-4cb9-81f3-384e50b71fd4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:31 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:31Z|00281|binding|INFO|Setting lport 6ed165c8-9f03-473d-9ce0-008ebbb2ad82 up in Southbound
Oct  7 16:24:31 np0005474864 nova_compute[192593]: 2025-10-07 20:24:31.280 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:31 np0005474864 NetworkManager[51631]: <info>  [1759868671.2853] manager: (tapc77dfc09-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/147)
Oct  7 16:24:31 np0005474864 systemd-udevd[230312]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.284 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[030e92de-4e2a-47d3-93c0-e65aecce5d5e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:31 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:31Z|00282|binding|INFO|Setting lport 6ed165c8-9f03-473d-9ce0-008ebbb2ad82 ovn-installed in OVS
Oct  7 16:24:31 np0005474864 nova_compute[192593]: 2025-10-07 20:24:31.289 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:31.320 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'name': 'tempest-TestGettingAddress-server-1047191865', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000036', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'paused', 'tenant_id': '2f9bf744045540618c9980fd4a7694f5', 'user_id': '334f092941fc46c496c7def76b2cfe18', 'hostId': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'status': 'paused', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  7 16:24:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:31.321 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.321 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[402831ae-91ae-44f6-a6b1-75f27a409052]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.325 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[01816cec-13ec-40b4-a059-a092a57f242c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:31 np0005474864 NetworkManager[51631]: <info>  [1759868671.3533] device (tapc77dfc09-a0): carrier: link connected
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.359 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[60744116-08f4-4c44-aac2-e7182277e4e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.379 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d4ff3088-0974-406b-88a1-561c9a5c9391]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc77dfc09-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:06:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430724, 'reachable_time': 16569, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230340, 'error': None, 'target': 'ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.397 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[03f205f5-f6c1-41d5-bb47-3d38b07ff7ae]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe8d:686'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 430724, 'tstamp': 430724}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230341, 'error': None, 'target': 'ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.413 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[186174ce-d3f5-4f75-b3d6-f1d7e9433692]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc77dfc09-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:8d:06:86'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 83], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430724, 'reachable_time': 16569, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230342, 'error': None, 'target': 'ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.446 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[45495239-cb8b-4ed5-85da-14ba4aafbedd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.509 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[4342a5a0-8f55-4f6c-9377-4f2115fe6764]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.520 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc77dfc09-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.521 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.521 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc77dfc09-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:24:31 np0005474864 nova_compute[192593]: 2025-10-07 20:24:31.524 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:31 np0005474864 NetworkManager[51631]: <info>  [1759868671.5245] manager: (tapc77dfc09-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/148)
Oct  7 16:24:31 np0005474864 kernel: tapc77dfc09-a0: entered promiscuous mode
Oct  7 16:24:31 np0005474864 nova_compute[192593]: 2025-10-07 20:24:31.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.529 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc77dfc09-a0, col_values=(('external_ids', {'iface-id': 'a2d4218c-8d67-4147-bfd2-daf44815e38b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:24:31 np0005474864 nova_compute[192593]: 2025-10-07 20:24:31.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:31 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:31Z|00283|binding|INFO|Releasing lport a2d4218c-8d67-4147-bfd2-daf44815e38b from this chassis (sb_readonly=0)
Oct  7 16:24:31 np0005474864 nova_compute[192593]: 2025-10-07 20:24:31.531 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.532 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c77dfc09-a940-4330-b50f-d7b09c70d5c0.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c77dfc09-a940-4330-b50f-d7b09c70d5c0.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.533 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[af311bf6-98a7-4be2-af52-87883ee80a25]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.534 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-c77dfc09-a940-4330-b50f-d7b09c70d5c0
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/c77dfc09-a940-4330-b50f-d7b09c70d5c0.pid.haproxy
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID c77dfc09-a940-4330-b50f-d7b09c70d5c0
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:24:31 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:31.536 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0', 'env', 'PROCESS_TAG=haproxy-c77dfc09-a940-4330-b50f-d7b09c70d5c0', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c77dfc09-a940-4330-b50f-d7b09c70d5c0.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:24:31 np0005474864 nova_compute[192593]: 2025-10-07 20:24:31.542 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:32 np0005474864 podman[230381]: 2025-10-07 20:24:32.039977034 +0000 UTC m=+0.077766328 container create ad137e4405282bc701338d20fdab19423584ef7d620685952a97082b24c223b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3)
Oct  7 16:24:32 np0005474864 systemd[1]: Started libpod-conmon-ad137e4405282bc701338d20fdab19423584ef7d620685952a97082b24c223b6.scope.
Oct  7 16:24:32 np0005474864 podman[230381]: 2025-10-07 20:24:31.994726032 +0000 UTC m=+0.032515416 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:24:32 np0005474864 nova_compute[192593]: 2025-10-07 20:24:32.099 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868672.0989914, 07f750b6-5548-4357-b8c0-426ee842fd13 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:24:32 np0005474864 nova_compute[192593]: 2025-10-07 20:24:32.101 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] VM Started (Lifecycle Event)#033[00m
Oct  7 16:24:32 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.116 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.117 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c10a52d-6657-474a-ac58-730714a1de53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '07f750b6-5548-4357-b8c0-426ee842fd13-vda', 'timestamp': '2025-10-07T20:24:31.321908', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'instance-00000036', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a208ae0a-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4307.271729425, 'message_signature': '0eb215e6939597c2c93d497f93588848f366591b051c1bb4f1503fe56c71831d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 
'07f750b6-5548-4357-b8c0-426ee842fd13-sda', 'timestamp': '2025-10-07T20:24:31.321908', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'instance-00000036', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a208bd14-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4307.271729425, 'message_signature': '27bf7d08c3df2189020927136f9a095cf2a48c01a1b634b7bbb2c818e84a11d3'}]}, 'timestamp': '2025-10-07 20:24:32.118068', '_unique_id': '091c82df5e0c4d259c03ff553ed506a7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/580fac30dbde053e3d10bd03d98320f5b4a1878cc8af323a5ac32548f35a756b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.119 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.120 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.120 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1047191865>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1047191865>]
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.120 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.130 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.131 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4478acad-1149-4921-bac5-467474e755e8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '07f750b6-5548-4357-b8c0-426ee842fd13-vda', 'timestamp': '2025-10-07T20:24:32.120547', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'instance-00000036', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a20abe7a-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4308.070371347, 'message_signature': 'e0527e898d96998b3df696265e95e866bd8bf92ef2d9b74610f64eb87d6e382c'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 
'07f750b6-5548-4357-b8c0-426ee842fd13-sda', 'timestamp': '2025-10-07T20:24:32.120547', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'instance-00000036', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a20ac8a2-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4308.070371347, 'message_signature': 'ef3d3df661ca6975b8c86baa154ca78bb91e3a95c04c4a48c80d142fbb0e0e07'}]}, 'timestamp': '2025-10-07 20:24:32.131409', '_unique_id': '91702bf82b0b4d4bbb72aad1de5d45a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.132 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  7 16:24:32 np0005474864 nova_compute[192593]: 2025-10-07 20:24:32.133 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.134 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 07f750b6-5548-4357-b8c0-426ee842fd13 / tap6ed165c8-9f inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.135 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fff9ac22-a587-4d05-8959-73a2218c5a3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000036-07f750b6-5548-4357-b8c0-426ee842fd13-tap6ed165c8-9f', 'timestamp': '2025-10-07T20:24:32.132753', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'tap6ed165c8-9f', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:9e:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ed165c8-9f'}, 'message_id': 'a20b6aa0-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4308.082561478, 'message_signature': '1c0158af94eea3787305d70e7f75e890d820eee84c604c00ade2367750fbf4bd'}]}, 'timestamp': '2025-10-07 20:24:32.135561', '_unique_id': '58c0223910a8418f82c0bb22508cbb32'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.136 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.137 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.137 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.137 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/disk.device.read.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '92f1e179-0041-4057-bba4-bb6935846e0f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '07f750b6-5548-4357-b8c0-426ee842fd13-vda', 'timestamp': '2025-10-07T20:24:32.137229', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'instance-00000036', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a20bb654-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4307.271729425, 'message_signature': '17e2bbb1e6cc7373d3ea239efdccf33ae7df638d9378893dccfae81663c4cc34'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '07f750b6-5548-4357-b8c0-426ee842fd13-sda', 'timestamp': '2025-10-07T20:24:32.137229', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'instance-00000036', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a20bbde8-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4307.271729425, 'message_signature': '2ff67f73587d79984462bc9ab4f5fe28bd4a1f0a88efb6ec2f2d0d688cfd031f'}]}, 'timestamp': '2025-10-07 20:24:32.137665', '_unique_id': '0ac1f6ed94a9451181bb6dbc79280af8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.138 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1047191865>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1047191865>]
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b357382-fa9e-4dd7-b591-958a506580f3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000036-07f750b6-5548-4357-b8c0-426ee842fd13-tap6ed165c8-9f', 'timestamp': '2025-10-07T20:24:32.139125', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'tap6ed165c8-9f', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:9e:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ed165c8-9f'}, 'message_id': 'a20bfef2-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4308.082561478, 'message_signature': '77534c890cc298af2f29af5891ba06785aba2a70796cc2c4d808d05e056425fb'}]}, 'timestamp': '2025-10-07 20:24:32.139365', '_unique_id': '57e1d5f1867342b8a02ebbef4220551c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.139 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.140 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.140 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7afabb80-5079-4740-b8ae-5601e897f92c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000036-07f750b6-5548-4357-b8c0-426ee842fd13-tap6ed165c8-9f', 'timestamp': '2025-10-07T20:24:32.140408', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'tap6ed165c8-9f', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:9e:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ed165c8-9f'}, 'message_id': 'a20c3142-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4308.082561478, 'message_signature': '04933cb88103d858c4420b2e7de0272dc977330897abfa89988cb4eeb37bd3af'}]}, 'timestamp': '2025-10-07 20:24:32.140630', '_unique_id': 'a5585d4d1f7f4170af51cba38742b00c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 nova_compute[192593]: 2025-10-07 20:24:32.141 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868672.099204, 07f750b6-5548-4357-b8c0-426ee842fd13 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 nova_compute[192593]: 2025-10-07 20:24:32.142 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.141 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 podman[230381]: 2025-10-07 20:24:32.142891844 +0000 UTC m=+0.180681148 container init ad137e4405282bc701338d20fdab19423584ef7d620685952a97082b24c223b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e937648e-dd1c-43a5-9599-11afee60a299', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '07f750b6-5548-4357-b8c0-426ee842fd13-vda', 'timestamp': '2025-10-07T20:24:32.141856', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'instance-00000036', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a20c6996-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4308.070371347, 'message_signature': '9d22ec3f2469fd3b06fc782f9b4cc5dafd238e91a15f167d46ad46a0e0aa15ef'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '07f750b6-5548-4357-b8c0-426ee842fd13-sda', 'timestamp': '2025-10-07T20:24:32.141856', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'instance-00000036', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a20c7102-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4308.070371347, 'message_signature': '527383f15714ce98322b0272f7b201436195b9cbc1107349534ed7df299e5286'}]}, 'timestamp': '2025-10-07 20:24:32.142264', '_unique_id': 'e28a7f895fb040a0a3a2b12f3ae59743'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.142 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.143 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.143 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9dbbeb5d-fd96-4546-a562-ba6a92aea872', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '07f750b6-5548-4357-b8c0-426ee842fd13-vda', 'timestamp': '2025-10-07T20:24:32.143429', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'instance-00000036', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a20ca6f4-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4307.271729425, 'message_signature': '6adb0718f99d978829cd0f45cb775ce0a6cfdc029d176c6b2247f4562a675105'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '07f750b6-5548-4357-b8c0-426ee842fd13-sda', 'timestamp': '2025-10-07T20:24:32.143429', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'instance-00000036', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a20cae42-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4307.271729425, 'message_signature': '89651626e831aa652d738f13775b6e8ca8220e00f21815ebd966f8dbb9b57e13'}]}, 'timestamp': '2025-10-07 20:24:32.143816', '_unique_id': '58f72fea98b14bbab4db59cdf280855a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.144 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  7 16:24:32 np0005474864 podman[230381]: 2025-10-07 20:24:32.151767289 +0000 UTC m=+0.189556593 container start ad137e4405282bc701338d20fdab19423584ef7d620685952a97082b24c223b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.164 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/cpu volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87af2ec1-cfa7-47ae-bfd7-66e669f96450', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'timestamp': '2025-10-07T20:24:32.144885', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'instance-00000036', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'a20fe256-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4308.11428528, 'message_signature': '3bc1959d5f08198aa76df3a29a8d2937acc51dff9d4fe04514660b55028594d9'}]}, 'timestamp': '2025-10-07 20:24:32.164831', '_unique_id': '152f8bcc391a44ddbb7ed4a99a4c3944'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.165 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/disk.device.read.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2654eab3-6a14-4cfe-aa2c-225987cb81a5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '07f750b6-5548-4357-b8c0-426ee842fd13-vda', 'timestamp': '2025-10-07T20:24:32.165938', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'instance-00000036', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a210165e-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4307.271729425, 'message_signature': '80e3dc85ace96313535ea8b8b36c2b3734f5565ebea403d2a513bb9380456ca7'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '07f750b6-5548-4357-b8c0-426ee842fd13-sda', 'timestamp': '2025-10-07T20:24:32.165938', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'instance-00000036', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a2101e88-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4307.271729425, 'message_signature': '96017b1498df3e5a1591c8b00e8dae72c82b00bf109001838a1f3a6e226577c4'}]}, 'timestamp': '2025-10-07 20:24:32.166356', '_unique_id': '1790af65f3d94c9fa5cb9238e5fc0a21'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.166 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.167 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.167 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f63a5104-b534-4e85-8b78-329502b25011', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000036-07f750b6-5548-4357-b8c0-426ee842fd13-tap6ed165c8-9f', 'timestamp': '2025-10-07T20:24:32.167450', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'tap6ed165c8-9f', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:9e:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ed165c8-9f'}, 'message_id': 'a21051a0-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4308.082561478, 'message_signature': '7d12b11ee471e35d8f6640d8715faa960c46a220b5409466338f0598b59b9ebd'}]}, 'timestamp': '2025-10-07 20:24:32.167675', '_unique_id': '7036faa12aa74b9188359bf463abac1b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1047191865>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1047191865>]
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.168 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 nova_compute[192593]: 2025-10-07 20:24:32.169 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8890e3e4-7ce6-495c-b736-c0cf88b1f0de', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000036-07f750b6-5548-4357-b8c0-426ee842fd13-tap6ed165c8-9f', 'timestamp': '2025-10-07T20:24:32.169021', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'tap6ed165c8-9f', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:9e:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ed165c8-9f'}, 'message_id': 'a2108ea4-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4308.082561478, 'message_signature': '0ca42fe097110dd5608318c97e45cc8a48da01f99e0ebe6891722756b956116a'}]}, 'timestamp': '2025-10-07 20:24:32.169237', '_unique_id': 'e02292f47c3746f9a8a452b71ebd3026'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.169 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.170 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.170 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.170 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance 07f750b6-5548-4357-b8c0-426ee842fd13: ceilometer.compute.pollsters.NoVolumeException
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.170 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.170 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/network.incoming.bytes volume: 110 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c0b565fe-1c54-43b9-a672-35d96b0c61eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 110, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000036-07f750b6-5548-4357-b8c0-426ee842fd13-tap6ed165c8-9f', 'timestamp': '2025-10-07T20:24:32.170517', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'tap6ed165c8-9f', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:9e:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ed165c8-9f'}, 'message_id': 'a210c90a-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4308.082561478, 'message_signature': '83aa13c0045dd05006bcabde695cc32466a372fb462008312729fecd638cea08'}]}, 'timestamp': '2025-10-07 20:24:32.170731', '_unique_id': 'ea5a2c30cc2d41c8bfcef0b924dd2dbd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.171 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/disk.device.read.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59e6d5de-8f01-48e8-9dd1-b4557d0c3cb8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '07f750b6-5548-4357-b8c0-426ee842fd13-vda', 'timestamp': '2025-10-07T20:24:32.171763', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'instance-00000036', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a210f9b6-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4307.271729425, 'message_signature': 'e3a3e6a9cffd4a7c5681122f59968d2bb840e8e36df831da7be55d27d3a7976e'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '07f750b6-5548-4357-b8c0-426ee842fd13-sda', 'timestamp': '2025-10-07T20:24:32.171763', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'instance-00000036', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a2110118-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4307.271729425, 'message_signature': '4294312e7ef7a0a5c77e40bf748cefd2355108788552e2943b5097f01f3f019d'}]}, 'timestamp': '2025-10-07 20:24:32.172151', '_unique_id': '6e4571ddcebf4372a2d50fb590d35282'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.172 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ee079c6c-95f0-484e-b84b-4889f5f4d7da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000036-07f750b6-5548-4357-b8c0-426ee842fd13-tap6ed165c8-9f', 'timestamp': '2025-10-07T20:24:32.173208', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'tap6ed165c8-9f', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:9e:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ed165c8-9f'}, 'message_id': 'a211334a-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4308.082561478, 'message_signature': '07f900b69ee131f9376ff430b65f62cc8d3b36a170942ad8107f859f52fbcfa6'}]}, 'timestamp': '2025-10-07 20:24:32.173451', '_unique_id': 'cadfd5a826214f56aeb7aba941ab511f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 nova_compute[192593]: 2025-10-07 20:24:32.174 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.173 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.174 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.174 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7742a00c-335c-40f8-b787-35d4d8cbb046', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000036-07f750b6-5548-4357-b8c0-426ee842fd13-tap6ed165c8-9f', 'timestamp': '2025-10-07T20:24:32.174553', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'tap6ed165c8-9f', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:9e:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ed165c8-9f'}, 'message_id': 'a211670c-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4308.082561478, 'message_signature': 'cce807815e9f6f1729dda3c35b1058fb7292060c9ac9aee818cf1eb09f1e9a27'}]}, 'timestamp': '2025-10-07 20:24:32.174777', '_unique_id': '889f201cd3c8448da792bff02d946151'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.175 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1047191865>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1047191865>]
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5593008e-97d7-4218-aeb0-846112c02e42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '07f750b6-5548-4357-b8c0-426ee842fd13-vda', 'timestamp': '2025-10-07T20:24:32.176080', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'instance-00000036', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a211a26c-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4308.070371347, 'message_signature': '4afc72b36c130c8fc0561e6814708d5eeceafda50b501ae70c378cb48aff890c'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '07f750b6-5548-4357-b8c0-426ee842fd13-sda', 'timestamp': '2025-10-07T20:24:32.176080', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'instance-00000036', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a211ab54-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4308.070371347, 'message_signature': '1f2457709d6a26e1471a1963b5f5cbfdcff7970964b437bb6d7785165bc8ee5b'}]}, 'timestamp': '2025-10-07 20:24:32.176529', '_unique_id': '2d18bb4ca88847aa9433ed417dc7aa52'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.176 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.177 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.177 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '77a5d508-42e5-4798-ad53-c86315ccc7eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000036-07f750b6-5548-4357-b8c0-426ee842fd13-tap6ed165c8-9f', 'timestamp': '2025-10-07T20:24:32.177781', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'tap6ed165c8-9f', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:9e:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ed165c8-9f'}, 'message_id': 'a211e524-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4308.082561478, 'message_signature': 'aeec50abeceae5da87a124f8387028802282ace6219d0201df2f5189b5c7628f'}]}, 'timestamp': '2025-10-07 20:24:32.178045', '_unique_id': 'b0092ae8fcdc407dabaf51ec82227210'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.178 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '03bca544-dad9-40b7-8e6e-aacb788fb495', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '07f750b6-5548-4357-b8c0-426ee842fd13-vda', 'timestamp': '2025-10-07T20:24:32.179097', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'instance-00000036', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a2121846-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4307.271729425, 'message_signature': '63d678d0dd903fe2831f8d616cbe2119fc192c56a24ea4c2f4deebc06d087b0f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '07f750b6-5548-4357-b8c0-426ee842fd13-sda', 'timestamp': '2025-10-07T20:24:32.179097', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'instance-00000036', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a212208e-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4307.271729425, 'message_signature': '585360c43e88d93386b4a2afc1f5970f12f3222aab6023b7bab2a2c999601e7a'}]}, 'timestamp': '2025-10-07 20:24:32.179512', '_unique_id': '6431caadd782423f958732f15507be7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.179 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.180 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.180 12 DEBUG ceilometer.compute.pollsters [-] 07f750b6-5548-4357-b8c0-426ee842fd13/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:24:32 np0005474864 neutron-haproxy-ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0[230396]: [NOTICE]   (230400) : New worker (230402) forked
Oct  7 16:24:32 np0005474864 neutron-haproxy-ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0[230396]: [NOTICE]   (230400) : Loading success.
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0f9b8d98-ec06-4f25-8521-e7d0350aac3a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000036-07f750b6-5548-4357-b8c0-426ee842fd13-tap6ed165c8-9f', 'timestamp': '2025-10-07T20:24:32.180575', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1047191865', 'name': 'tap6ed165c8-9f', 'instance_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'paused', 'state': 'paused', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:ff:9e:d9', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6ed165c8-9f'}, 'message_id': 'a2125234-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4308.082561478, 'message_signature': 'e5f3965fb2e2e14cb373e1eb3f90dd27624bdcf60f7fd5f7fa569f379208ed4b'}]}, 'timestamp': '2025-10-07 20:24:32.180798', '_unique_id': '67203de1e6df4574874d83347085bb91'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:24:32 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:24:32.181 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:24:32 np0005474864 nova_compute[192593]: 2025-10-07 20:24:32.195 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:24:32 np0005474864 nova_compute[192593]: 2025-10-07 20:24:32.232 2 DEBUG nova.network.neutron [req-5e5a5b21-c1e3-45cd-8346-743330e61392 req-3615ae18-4fed-4090-9b97-b90e812a9531 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Updated VIF entry in instance network info cache for port 6ed165c8-9f03-473d-9ce0-008ebbb2ad82. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:24:32 np0005474864 nova_compute[192593]: 2025-10-07 20:24:32.232 2 DEBUG nova.network.neutron [req-5e5a5b21-c1e3-45cd-8346-743330e61392 req-3615ae18-4fed-4090-9b97-b90e812a9531 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Updating instance_info_cache with network_info: [{"id": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "address": "fa:16:3e:ff:9e:d9", "network": {"id": "c77dfc09-a940-4330-b50f-d7b09c70d5c0", "bridge": "br-int", "label": "tempest-network-smoke--1165022391", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ed165c8-9f", "ovs_interfaceid": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:24:32 np0005474864 nova_compute[192593]: 2025-10-07 20:24:32.251 2 DEBUG oslo_concurrency.lockutils [req-5e5a5b21-c1e3-45cd-8346-743330e61392 req-3615ae18-4fed-4090-9b97-b90e812a9531 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-07f750b6-5548-4357-b8c0-426ee842fd13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:24:32 np0005474864 nova_compute[192593]: 2025-10-07 20:24:32.687 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:24:32 np0005474864 nova_compute[192593]: 2025-10-07 20:24:32.687 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.046 2 DEBUG nova.compute.manager [req-44fb1bfb-06fb-4356-a9d9-92c725abafd4 req-90e2d324-0cc9-4cee-96f6-fa4dda6d737a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Received event network-vif-plugged-6ed165c8-9f03-473d-9ce0-008ebbb2ad82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.047 2 DEBUG oslo_concurrency.lockutils [req-44fb1bfb-06fb-4356-a9d9-92c725abafd4 req-90e2d324-0cc9-4cee-96f6-fa4dda6d737a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "07f750b6-5548-4357-b8c0-426ee842fd13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.047 2 DEBUG oslo_concurrency.lockutils [req-44fb1bfb-06fb-4356-a9d9-92c725abafd4 req-90e2d324-0cc9-4cee-96f6-fa4dda6d737a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "07f750b6-5548-4357-b8c0-426ee842fd13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.048 2 DEBUG oslo_concurrency.lockutils [req-44fb1bfb-06fb-4356-a9d9-92c725abafd4 req-90e2d324-0cc9-4cee-96f6-fa4dda6d737a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "07f750b6-5548-4357-b8c0-426ee842fd13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.048 2 DEBUG nova.compute.manager [req-44fb1bfb-06fb-4356-a9d9-92c725abafd4 req-90e2d324-0cc9-4cee-96f6-fa4dda6d737a 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Processing event network-vif-plugged-6ed165c8-9f03-473d-9ce0-008ebbb2ad82 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.049 2 DEBUG nova.compute.manager [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.054 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868673.0543349, 07f750b6-5548-4357-b8c0-426ee842fd13 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.055 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.058 2 DEBUG nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.062 2 INFO nova.virt.libvirt.driver [-] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Instance spawned successfully.#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.062 2 DEBUG nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.078 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.088 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.093 2 DEBUG nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.094 2 DEBUG nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.094 2 DEBUG nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.095 2 DEBUG nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.097 2 DEBUG nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.097 2 DEBUG nova.virt.libvirt.driver [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.112 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.165 2 INFO nova.compute.manager [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Took 6.91 seconds to spawn the instance on the hypervisor.#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.166 2 DEBUG nova.compute.manager [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.254 2 INFO nova.compute.manager [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Took 7.53 seconds to build instance.#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.270 2 DEBUG oslo_concurrency.lockutils [None req-2844da67-5cba-44d8-a107-4975efa07201 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "07f750b6-5548-4357-b8c0-426ee842fd13" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.617s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:33 np0005474864 nova_compute[192593]: 2025-10-07 20:24:33.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:34 np0005474864 nova_compute[192593]: 2025-10-07 20:24:34.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:24:34 np0005474864 nova_compute[192593]: 2025-10-07 20:24:34.092 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:24:34 np0005474864 nova_compute[192593]: 2025-10-07 20:24:34.092 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:24:34 np0005474864 nova_compute[192593]: 2025-10-07 20:24:34.383 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "refresh_cache-07f750b6-5548-4357-b8c0-426ee842fd13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:24:34 np0005474864 nova_compute[192593]: 2025-10-07 20:24:34.384 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquired lock "refresh_cache-07f750b6-5548-4357-b8c0-426ee842fd13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:24:34 np0005474864 nova_compute[192593]: 2025-10-07 20:24:34.384 2 DEBUG nova.network.neutron [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  7 16:24:34 np0005474864 nova_compute[192593]: 2025-10-07 20:24:34.384 2 DEBUG nova.objects.instance [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 07f750b6-5548-4357-b8c0-426ee842fd13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:24:34 np0005474864 nova_compute[192593]: 2025-10-07 20:24:34.387 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:35 np0005474864 nova_compute[192593]: 2025-10-07 20:24:35.220 2 DEBUG nova.compute.manager [req-fed1940d-694b-4557-bc06-ea8989bd96a1 req-e859103a-1dc6-480c-9dc6-27d4ba1c0097 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Received event network-vif-plugged-6ed165c8-9f03-473d-9ce0-008ebbb2ad82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:24:35 np0005474864 nova_compute[192593]: 2025-10-07 20:24:35.220 2 DEBUG oslo_concurrency.lockutils [req-fed1940d-694b-4557-bc06-ea8989bd96a1 req-e859103a-1dc6-480c-9dc6-27d4ba1c0097 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "07f750b6-5548-4357-b8c0-426ee842fd13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:35 np0005474864 nova_compute[192593]: 2025-10-07 20:24:35.221 2 DEBUG oslo_concurrency.lockutils [req-fed1940d-694b-4557-bc06-ea8989bd96a1 req-e859103a-1dc6-480c-9dc6-27d4ba1c0097 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "07f750b6-5548-4357-b8c0-426ee842fd13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:35 np0005474864 nova_compute[192593]: 2025-10-07 20:24:35.221 2 DEBUG oslo_concurrency.lockutils [req-fed1940d-694b-4557-bc06-ea8989bd96a1 req-e859103a-1dc6-480c-9dc6-27d4ba1c0097 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "07f750b6-5548-4357-b8c0-426ee842fd13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:35 np0005474864 nova_compute[192593]: 2025-10-07 20:24:35.221 2 DEBUG nova.compute.manager [req-fed1940d-694b-4557-bc06-ea8989bd96a1 req-e859103a-1dc6-480c-9dc6-27d4ba1c0097 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] No waiting events found dispatching network-vif-plugged-6ed165c8-9f03-473d-9ce0-008ebbb2ad82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:24:35 np0005474864 nova_compute[192593]: 2025-10-07 20:24:35.221 2 WARNING nova.compute.manager [req-fed1940d-694b-4557-bc06-ea8989bd96a1 req-e859103a-1dc6-480c-9dc6-27d4ba1c0097 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Received unexpected event network-vif-plugged-6ed165c8-9f03-473d-9ce0-008ebbb2ad82 for instance with vm_state active and task_state None.#033[00m
Oct  7 16:24:35 np0005474864 nova_compute[192593]: 2025-10-07 20:24:35.225 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:35 np0005474864 podman[230411]: 2025-10-07 20:24:35.412350829 +0000 UTC m=+0.094670604 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251001)
Oct  7 16:24:39 np0005474864 nova_compute[192593]: 2025-10-07 20:24:39.244 2 DEBUG nova.network.neutron [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Updating instance_info_cache with network_info: [{"id": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "address": "fa:16:3e:ff:9e:d9", "network": {"id": "c77dfc09-a940-4330-b50f-d7b09c70d5c0", "bridge": "br-int", "label": "tempest-network-smoke--1165022391", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ed165c8-9f", "ovs_interfaceid": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:24:39 np0005474864 nova_compute[192593]: 2025-10-07 20:24:39.277 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Releasing lock "refresh_cache-07f750b6-5548-4357-b8c0-426ee842fd13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:24:39 np0005474864 nova_compute[192593]: 2025-10-07 20:24:39.277 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  7 16:24:39 np0005474864 nova_compute[192593]: 2025-10-07 20:24:39.278 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:24:39 np0005474864 nova_compute[192593]: 2025-10-07 20:24:39.279 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:24:39 np0005474864 nova_compute[192593]: 2025-10-07 20:24:39.279 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:24:39 np0005474864 nova_compute[192593]: 2025-10-07 20:24:39.279 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:24:39 np0005474864 nova_compute[192593]: 2025-10-07 20:24:39.398 2 DEBUG nova.compute.manager [req-65c5a643-71f5-4f6a-8fab-42f3b737d808 req-d00da0ee-3f5a-4cce-be42-4900da4ddacf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Received event network-changed-6ed165c8-9f03-473d-9ce0-008ebbb2ad82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:24:39 np0005474864 nova_compute[192593]: 2025-10-07 20:24:39.398 2 DEBUG nova.compute.manager [req-65c5a643-71f5-4f6a-8fab-42f3b737d808 req-d00da0ee-3f5a-4cce-be42-4900da4ddacf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Refreshing instance network info cache due to event network-changed-6ed165c8-9f03-473d-9ce0-008ebbb2ad82. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:24:39 np0005474864 nova_compute[192593]: 2025-10-07 20:24:39.399 2 DEBUG oslo_concurrency.lockutils [req-65c5a643-71f5-4f6a-8fab-42f3b737d808 req-d00da0ee-3f5a-4cce-be42-4900da4ddacf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-07f750b6-5548-4357-b8c0-426ee842fd13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:24:39 np0005474864 nova_compute[192593]: 2025-10-07 20:24:39.399 2 DEBUG oslo_concurrency.lockutils [req-65c5a643-71f5-4f6a-8fab-42f3b737d808 req-d00da0ee-3f5a-4cce-be42-4900da4ddacf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-07f750b6-5548-4357-b8c0-426ee842fd13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:24:39 np0005474864 nova_compute[192593]: 2025-10-07 20:24:39.400 2 DEBUG nova.network.neutron [req-65c5a643-71f5-4f6a-8fab-42f3b737d808 req-d00da0ee-3f5a-4cce-be42-4900da4ddacf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Refreshing network info cache for port 6ed165c8-9f03-473d-9ce0-008ebbb2ad82 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:24:39 np0005474864 nova_compute[192593]: 2025-10-07 20:24:39.428 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:40 np0005474864 nova_compute[192593]: 2025-10-07 20:24:40.228 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:43 np0005474864 nova_compute[192593]: 2025-10-07 20:24:43.828 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:44 np0005474864 nova_compute[192593]: 2025-10-07 20:24:44.273 2 DEBUG nova.network.neutron [req-65c5a643-71f5-4f6a-8fab-42f3b737d808 req-d00da0ee-3f5a-4cce-be42-4900da4ddacf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Updated VIF entry in instance network info cache for port 6ed165c8-9f03-473d-9ce0-008ebbb2ad82. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:24:44 np0005474864 nova_compute[192593]: 2025-10-07 20:24:44.275 2 DEBUG nova.network.neutron [req-65c5a643-71f5-4f6a-8fab-42f3b737d808 req-d00da0ee-3f5a-4cce-be42-4900da4ddacf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Updating instance_info_cache with network_info: [{"id": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "address": "fa:16:3e:ff:9e:d9", "network": {"id": "c77dfc09-a940-4330-b50f-d7b09c70d5c0", "bridge": "br-int", "label": "tempest-network-smoke--1165022391", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ed165c8-9f", "ovs_interfaceid": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:24:44 np0005474864 nova_compute[192593]: 2025-10-07 20:24:44.296 2 DEBUG oslo_concurrency.lockutils [req-65c5a643-71f5-4f6a-8fab-42f3b737d808 req-d00da0ee-3f5a-4cce-be42-4900da4ddacf 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-07f750b6-5548-4357-b8c0-426ee842fd13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:24:44 np0005474864 nova_compute[192593]: 2025-10-07 20:24:44.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:44 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:44Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ff:9e:d9 10.100.0.6
Oct  7 16:24:44 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:44Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ff:9e:d9 10.100.0.6
Oct  7 16:24:45 np0005474864 nova_compute[192593]: 2025-10-07 20:24:45.231 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:48 np0005474864 nova_compute[192593]: 2025-10-07 20:24:48.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:48 np0005474864 podman[230443]: 2025-10-07 20:24:48.397844925 +0000 UTC m=+0.082241166 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:24:48 np0005474864 podman[230444]: 2025-10-07 20:24:48.421511716 +0000 UTC m=+0.095623372 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.buildah.version=1.33.7, config_id=edpm, vcs-type=git, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, io.openshift.tags=minimal rhel9, version=9.6, build-date=2025-08-20T13:12:41)
Oct  7 16:24:49 np0005474864 nova_compute[192593]: 2025-10-07 20:24:49.496 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:50 np0005474864 nova_compute[192593]: 2025-10-07 20:24:50.234 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:52 np0005474864 podman[230489]: 2025-10-07 20:24:52.404080224 +0000 UTC m=+0.088161927 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  7 16:24:52 np0005474864 podman[230491]: 2025-10-07 20:24:52.430059832 +0000 UTC m=+0.096626641 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible)
Oct  7 16:24:52 np0005474864 podman[230490]: 2025-10-07 20:24:52.458878931 +0000 UTC m=+0.133832301 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Oct  7 16:24:54 np0005474864 nova_compute[192593]: 2025-10-07 20:24:54.499 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:55 np0005474864 nova_compute[192593]: 2025-10-07 20:24:55.237 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:57 np0005474864 podman[230553]: 2025-10-07 20:24:57.400802126 +0000 UTC m=+0.086696395 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.441 2 DEBUG oslo_concurrency.lockutils [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "07f750b6-5548-4357-b8c0-426ee842fd13" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.442 2 DEBUG oslo_concurrency.lockutils [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "07f750b6-5548-4357-b8c0-426ee842fd13" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.442 2 DEBUG oslo_concurrency.lockutils [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "07f750b6-5548-4357-b8c0-426ee842fd13-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.443 2 DEBUG oslo_concurrency.lockutils [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "07f750b6-5548-4357-b8c0-426ee842fd13-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.443 2 DEBUG oslo_concurrency.lockutils [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "07f750b6-5548-4357-b8c0-426ee842fd13-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.445 2 INFO nova.compute.manager [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Terminating instance#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.446 2 DEBUG nova.compute.manager [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  7 16:24:57 np0005474864 kernel: tap6ed165c8-9f (unregistering): left promiscuous mode
Oct  7 16:24:57 np0005474864 NetworkManager[51631]: <info>  [1759868697.4718] device (tap6ed165c8-9f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:24:57 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:57Z|00284|binding|INFO|Releasing lport 6ed165c8-9f03-473d-9ce0-008ebbb2ad82 from this chassis (sb_readonly=0)
Oct  7 16:24:57 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:57Z|00285|binding|INFO|Setting lport 6ed165c8-9f03-473d-9ce0-008ebbb2ad82 down in Southbound
Oct  7 16:24:57 np0005474864 ovn_controller[94801]: 2025-10-07T20:24:57Z|00286|binding|INFO|Removing iface tap6ed165c8-9f ovn-installed in OVS
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.481 2 DEBUG nova.compute.manager [req-770db0d4-0f46-44c5-ad32-e8ce10953c11 req-1337d9fc-b949-41e4-bf5b-58acbd4d7780 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Received event network-changed-6ed165c8-9f03-473d-9ce0-008ebbb2ad82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.481 2 DEBUG nova.compute.manager [req-770db0d4-0f46-44c5-ad32-e8ce10953c11 req-1337d9fc-b949-41e4-bf5b-58acbd4d7780 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Refreshing instance network info cache due to event network-changed-6ed165c8-9f03-473d-9ce0-008ebbb2ad82. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.482 2 DEBUG oslo_concurrency.lockutils [req-770db0d4-0f46-44c5-ad32-e8ce10953c11 req-1337d9fc-b949-41e4-bf5b-58acbd4d7780 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-07f750b6-5548-4357-b8c0-426ee842fd13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.482 2 DEBUG oslo_concurrency.lockutils [req-770db0d4-0f46-44c5-ad32-e8ce10953c11 req-1337d9fc-b949-41e4-bf5b-58acbd4d7780 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-07f750b6-5548-4357-b8c0-426ee842fd13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.482 2 DEBUG nova.network.neutron [req-770db0d4-0f46-44c5-ad32-e8ce10953c11 req-1337d9fc-b949-41e4-bf5b-58acbd4d7780 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Refreshing network info cache for port 6ed165c8-9f03-473d-9ce0-008ebbb2ad82 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:57.486 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ff:9e:d9 10.100.0.6 2001:db8:0:1:f816:3eff:feff:9ed9 2001:db8::f816:3eff:feff:9ed9'], port_security=['fa:16:3e:ff:9e:d9 10.100.0.6 2001:db8:0:1:f816:3eff:feff:9ed9 2001:db8::f816:3eff:feff:9ed9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28 2001:db8:0:1:f816:3eff:feff:9ed9/64 2001:db8::f816:3eff:feff:9ed9/64', 'neutron:device_id': '07f750b6-5548-4357-b8c0-426ee842fd13', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c77dfc09-a940-4330-b50f-d7b09c70d5c0', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fbea62e8-afa6-48f6-8f8e-1b935b8ad5da', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=1d5b554d-fe47-46fd-9f7a-274db91b3c84, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=6ed165c8-9f03-473d-9ce0-008ebbb2ad82) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:24:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:57.488 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 6ed165c8-9f03-473d-9ce0-008ebbb2ad82 in datapath c77dfc09-a940-4330-b50f-d7b09c70d5c0 unbound from our chassis#033[00m
Oct  7 16:24:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:57.489 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c77dfc09-a940-4330-b50f-d7b09c70d5c0, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:24:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:57.490 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ff61b38e-45f0-4862-8ec5-adbe0711fca7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:57.491 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0 namespace which is not needed anymore#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:57 np0005474864 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000036.scope: Deactivated successfully.
Oct  7 16:24:57 np0005474864 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000036.scope: Consumed 12.966s CPU time.
Oct  7 16:24:57 np0005474864 systemd-machined[152586]: Machine qemu-19-instance-00000036 terminated.
Oct  7 16:24:57 np0005474864 neutron-haproxy-ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0[230396]: [NOTICE]   (230400) : haproxy version is 2.8.14-c23fe91
Oct  7 16:24:57 np0005474864 neutron-haproxy-ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0[230396]: [NOTICE]   (230400) : path to executable is /usr/sbin/haproxy
Oct  7 16:24:57 np0005474864 neutron-haproxy-ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0[230396]: [WARNING]  (230400) : Exiting Master process...
Oct  7 16:24:57 np0005474864 neutron-haproxy-ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0[230396]: [WARNING]  (230400) : Exiting Master process...
Oct  7 16:24:57 np0005474864 neutron-haproxy-ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0[230396]: [ALERT]    (230400) : Current worker (230402) exited with code 143 (Terminated)
Oct  7 16:24:57 np0005474864 neutron-haproxy-ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0[230396]: [WARNING]  (230400) : All workers exited. Exiting... (0)
Oct  7 16:24:57 np0005474864 systemd[1]: libpod-ad137e4405282bc701338d20fdab19423584ef7d620685952a97082b24c223b6.scope: Deactivated successfully.
Oct  7 16:24:57 np0005474864 podman[230596]: 2025-10-07 20:24:57.656847031 +0000 UTC m=+0.059899014 container died ad137e4405282bc701338d20fdab19423584ef7d620685952a97082b24c223b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.666 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.673 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:57 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-ad137e4405282bc701338d20fdab19423584ef7d620685952a97082b24c223b6-userdata-shm.mount: Deactivated successfully.
Oct  7 16:24:57 np0005474864 systemd[1]: var-lib-containers-storage-overlay-580fac30dbde053e3d10bd03d98320f5b4a1878cc8af323a5ac32548f35a756b-merged.mount: Deactivated successfully.
Oct  7 16:24:57 np0005474864 podman[230596]: 2025-10-07 20:24:57.710043041 +0000 UTC m=+0.113095024 container cleanup ad137e4405282bc701338d20fdab19423584ef7d620685952a97082b24c223b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.718 2 INFO nova.virt.libvirt.driver [-] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Instance destroyed successfully.#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.719 2 DEBUG nova.objects.instance [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'resources' on Instance uuid 07f750b6-5548-4357-b8c0-426ee842fd13 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.732 2 DEBUG nova.virt.libvirt.vif [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:24:24Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1047191865',display_name='tempest-TestGettingAddress-server-1047191865',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1047191865',id=54,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBODApuobVWheYyTm3OBTjSn4TO/V3tPPQJs6pnABxyGhko3e1WXcNE+wSqvg7lzxnTDu6x2KD1f3a91mdVG/0EiHUoMJ2bm+0slGalNli+RQ8BMC263wwvs8kCY7EtokcQ==',key_name='tempest-TestGettingAddress-891638733',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:24:33Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-7503jqbq',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:24:33Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=07f750b6-5548-4357-b8c0-426ee842fd13,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "address": "fa:16:3e:ff:9e:d9", "network": {"id": "c77dfc09-a940-4330-b50f-d7b09c70d5c0", "bridge": "br-int", "label": "tempest-network-smoke--1165022391", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:9ed9", 
"type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ed165c8-9f", "ovs_interfaceid": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.733 2 DEBUG nova.network.os_vif_util [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "address": "fa:16:3e:ff:9e:d9", "network": {"id": "c77dfc09-a940-4330-b50f-d7b09c70d5c0", "bridge": "br-int", "label": "tempest-network-smoke--1165022391", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.245", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ed165c8-9f", "ovs_interfaceid": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:24:57 np0005474864 systemd[1]: libpod-conmon-ad137e4405282bc701338d20fdab19423584ef7d620685952a97082b24c223b6.scope: Deactivated successfully.
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.736 2 DEBUG nova.network.os_vif_util [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ff:9e:d9,bridge_name='br-int',has_traffic_filtering=True,id=6ed165c8-9f03-473d-9ce0-008ebbb2ad82,network=Network(c77dfc09-a940-4330-b50f-d7b09c70d5c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ed165c8-9f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.736 2 DEBUG os_vif [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:9e:d9,bridge_name='br-int',has_traffic_filtering=True,id=6ed165c8-9f03-473d-9ce0-008ebbb2ad82,network=Network(c77dfc09-a940-4330-b50f-d7b09c70d5c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ed165c8-9f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.739 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.739 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6ed165c8-9f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.744 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.748 2 INFO os_vif [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ff:9e:d9,bridge_name='br-int',has_traffic_filtering=True,id=6ed165c8-9f03-473d-9ce0-008ebbb2ad82,network=Network(c77dfc09-a940-4330-b50f-d7b09c70d5c0),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6ed165c8-9f')#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.748 2 INFO nova.virt.libvirt.driver [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Deleting instance files /var/lib/nova/instances/07f750b6-5548-4357-b8c0-426ee842fd13_del#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.749 2 INFO nova.virt.libvirt.driver [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Deletion of /var/lib/nova/instances/07f750b6-5548-4357-b8c0-426ee842fd13_del complete#033[00m
Oct  7 16:24:57 np0005474864 podman[230642]: 2025-10-07 20:24:57.781658561 +0000 UTC m=+0.042509394 container remove ad137e4405282bc701338d20fdab19423584ef7d620685952a97082b24c223b6 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  7 16:24:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:57.786 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[660609c8-61ae-4eca-a730-204968ce65c3]: (4, ('Tue Oct  7 08:24:57 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0 (ad137e4405282bc701338d20fdab19423584ef7d620685952a97082b24c223b6)\nad137e4405282bc701338d20fdab19423584ef7d620685952a97082b24c223b6\nTue Oct  7 08:24:57 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0 (ad137e4405282bc701338d20fdab19423584ef7d620685952a97082b24c223b6)\nad137e4405282bc701338d20fdab19423584ef7d620685952a97082b24c223b6\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:57.788 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[63e8cb78-349f-4ca4-bcae-8e8d91daeb0f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:57.789 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc77dfc09-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:24:57 np0005474864 kernel: tapc77dfc09-a0: left promiscuous mode
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.790 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.803 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:57.805 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[cfdc318e-cab3-410b-9588-8a2b1eae41ef]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.807 2 INFO nova.compute.manager [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.809 2 DEBUG oslo.service.loopingcall [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.809 2 DEBUG nova.compute.manager [-] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  7 16:24:57 np0005474864 nova_compute[192593]: 2025-10-07 20:24:57.810 2 DEBUG nova.network.neutron [-] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  7 16:24:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:57.834 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[b4d9137c-a256-4e09-b7aa-7522c55d219c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:57.836 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d98895a7-91fe-44df-8b6a-2f4840a52ef1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:57.862 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[fccf2f96-f61a-4561-a35f-8792b5457de9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 430714, 'reachable_time': 15638, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230657, 'error': None, 'target': 'ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:57 np0005474864 systemd[1]: run-netns-ovnmeta\x2dc77dfc09\x2da940\x2d4330\x2db50f\x2dd7b09c70d5c0.mount: Deactivated successfully.
Oct  7 16:24:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:57.866 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c77dfc09-a940-4330-b50f-d7b09c70d5c0 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:24:57 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:24:57.867 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[cf7f0969-8983-4481-bb9b-166606b2056c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.505 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.586 2 DEBUG nova.compute.manager [req-b8b3c547-f7e8-48e6-a3fd-443bb51a214d req-491be1b2-3989-4983-b09f-5911668b2301 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Received event network-vif-unplugged-6ed165c8-9f03-473d-9ce0-008ebbb2ad82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.586 2 DEBUG oslo_concurrency.lockutils [req-b8b3c547-f7e8-48e6-a3fd-443bb51a214d req-491be1b2-3989-4983-b09f-5911668b2301 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "07f750b6-5548-4357-b8c0-426ee842fd13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.587 2 DEBUG oslo_concurrency.lockutils [req-b8b3c547-f7e8-48e6-a3fd-443bb51a214d req-491be1b2-3989-4983-b09f-5911668b2301 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "07f750b6-5548-4357-b8c0-426ee842fd13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.587 2 DEBUG oslo_concurrency.lockutils [req-b8b3c547-f7e8-48e6-a3fd-443bb51a214d req-491be1b2-3989-4983-b09f-5911668b2301 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "07f750b6-5548-4357-b8c0-426ee842fd13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.587 2 DEBUG nova.compute.manager [req-b8b3c547-f7e8-48e6-a3fd-443bb51a214d req-491be1b2-3989-4983-b09f-5911668b2301 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] No waiting events found dispatching network-vif-unplugged-6ed165c8-9f03-473d-9ce0-008ebbb2ad82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.588 2 DEBUG nova.compute.manager [req-b8b3c547-f7e8-48e6-a3fd-443bb51a214d req-491be1b2-3989-4983-b09f-5911668b2301 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Received event network-vif-unplugged-6ed165c8-9f03-473d-9ce0-008ebbb2ad82 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.588 2 DEBUG nova.compute.manager [req-b8b3c547-f7e8-48e6-a3fd-443bb51a214d req-491be1b2-3989-4983-b09f-5911668b2301 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Received event network-vif-plugged-6ed165c8-9f03-473d-9ce0-008ebbb2ad82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.589 2 DEBUG oslo_concurrency.lockutils [req-b8b3c547-f7e8-48e6-a3fd-443bb51a214d req-491be1b2-3989-4983-b09f-5911668b2301 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "07f750b6-5548-4357-b8c0-426ee842fd13-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.589 2 DEBUG oslo_concurrency.lockutils [req-b8b3c547-f7e8-48e6-a3fd-443bb51a214d req-491be1b2-3989-4983-b09f-5911668b2301 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "07f750b6-5548-4357-b8c0-426ee842fd13-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.589 2 DEBUG oslo_concurrency.lockutils [req-b8b3c547-f7e8-48e6-a3fd-443bb51a214d req-491be1b2-3989-4983-b09f-5911668b2301 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "07f750b6-5548-4357-b8c0-426ee842fd13-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.590 2 DEBUG nova.compute.manager [req-b8b3c547-f7e8-48e6-a3fd-443bb51a214d req-491be1b2-3989-4983-b09f-5911668b2301 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] No waiting events found dispatching network-vif-plugged-6ed165c8-9f03-473d-9ce0-008ebbb2ad82 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.590 2 WARNING nova.compute.manager [req-b8b3c547-f7e8-48e6-a3fd-443bb51a214d req-491be1b2-3989-4983-b09f-5911668b2301 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Received unexpected event network-vif-plugged-6ed165c8-9f03-473d-9ce0-008ebbb2ad82 for instance with vm_state active and task_state deleting.#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.736 2 DEBUG nova.network.neutron [-] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.756 2 INFO nova.compute.manager [-] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Took 1.95 seconds to deallocate network for instance.#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.818 2 DEBUG oslo_concurrency.lockutils [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.818 2 DEBUG oslo_concurrency.lockutils [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.890 2 DEBUG nova.compute.provider_tree [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.912 2 DEBUG nova.scheduler.client.report [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.940 2 DEBUG oslo_concurrency.lockutils [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:24:59 np0005474864 nova_compute[192593]: 2025-10-07 20:24:59.979 2 INFO nova.scheduler.client.report [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Deleted allocations for instance 07f750b6-5548-4357-b8c0-426ee842fd13#033[00m
Oct  7 16:25:00 np0005474864 nova_compute[192593]: 2025-10-07 20:25:00.076 2 DEBUG oslo_concurrency.lockutils [None req-a8fd8e8f-7510-45ef-9ede-4bbc3678d72d 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "07f750b6-5548-4357-b8c0-426ee842fd13" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.634s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:25:01 np0005474864 nova_compute[192593]: 2025-10-07 20:25:01.284 2 DEBUG nova.network.neutron [req-770db0d4-0f46-44c5-ad32-e8ce10953c11 req-1337d9fc-b949-41e4-bf5b-58acbd4d7780 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Updated VIF entry in instance network info cache for port 6ed165c8-9f03-473d-9ce0-008ebbb2ad82. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:25:01 np0005474864 nova_compute[192593]: 2025-10-07 20:25:01.284 2 DEBUG nova.network.neutron [req-770db0d4-0f46-44c5-ad32-e8ce10953c11 req-1337d9fc-b949-41e4-bf5b-58acbd4d7780 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Updating instance_info_cache with network_info: [{"id": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "address": "fa:16:3e:ff:9e:d9", "network": {"id": "c77dfc09-a940-4330-b50f-d7b09c70d5c0", "bridge": "br-int", "label": "tempest-network-smoke--1165022391", "subnets": [{"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feff:9ed9", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6ed165c8-9f", "ovs_interfaceid": "6ed165c8-9f03-473d-9ce0-008ebbb2ad82", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:25:01 np0005474864 nova_compute[192593]: 2025-10-07 20:25:01.309 2 DEBUG oslo_concurrency.lockutils [req-770db0d4-0f46-44c5-ad32-e8ce10953c11 req-1337d9fc-b949-41e4-bf5b-58acbd4d7780 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-07f750b6-5548-4357-b8c0-426ee842fd13" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:25:01 np0005474864 podman[230658]: 2025-10-07 20:25:01.366757146 +0000 UTC m=+0.059816541 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:25:01 np0005474864 nova_compute[192593]: 2025-10-07 20:25:01.693 2 DEBUG nova.compute.manager [req-ef8bcb5b-54b0-40dd-a098-a3bdd9f01c2a req-714c91f6-b442-47e7-bd22-72254c6c90e2 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Received event network-vif-deleted-6ed165c8-9f03-473d-9ce0-008ebbb2ad82 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:25:02 np0005474864 nova_compute[192593]: 2025-10-07 20:25:02.743 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:04 np0005474864 nova_compute[192593]: 2025-10-07 20:25:04.510 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:06 np0005474864 podman[230684]: 2025-10-07 20:25:06.396785714 +0000 UTC m=+0.076852991 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  7 16:25:07 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:07.178 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:76:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'ba:bb:91:e8:2b:5d'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:25:07 np0005474864 nova_compute[192593]: 2025-10-07 20:25:07.178 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:07 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:07.179 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  7 16:25:07 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:07.179 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:25:07 np0005474864 nova_compute[192593]: 2025-10-07 20:25:07.746 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:09 np0005474864 nova_compute[192593]: 2025-10-07 20:25:09.512 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:10 np0005474864 nova_compute[192593]: 2025-10-07 20:25:10.578 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:10 np0005474864 nova_compute[192593]: 2025-10-07 20:25:10.697 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:12 np0005474864 nova_compute[192593]: 2025-10-07 20:25:12.711 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759868697.7096949, 07f750b6-5548-4357-b8c0-426ee842fd13 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:25:12 np0005474864 nova_compute[192593]: 2025-10-07 20:25:12.712 2 INFO nova.compute.manager [-] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] VM Stopped (Lifecycle Event)#033[00m
Oct  7 16:25:12 np0005474864 nova_compute[192593]: 2025-10-07 20:25:12.743 2 DEBUG nova.compute.manager [None req-961b3495-4c6e-4ce2-b327-ff0df05a43a6 - - - - - -] [instance: 07f750b6-5548-4357-b8c0-426ee842fd13] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:25:12 np0005474864 nova_compute[192593]: 2025-10-07 20:25:12.792 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:14 np0005474864 nova_compute[192593]: 2025-10-07 20:25:14.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:16.201 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:25:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:16.202 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:25:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:16.202 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:25:17 np0005474864 nova_compute[192593]: 2025-10-07 20:25:17.796 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:19 np0005474864 podman[230707]: 2025-10-07 20:25:19.416845713 +0000 UTC m=+0.100952065 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, maintainer=Red Hat, Inc., vcs-type=git, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, name=ubi9-minimal, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  7 16:25:19 np0005474864 podman[230706]: 2025-10-07 20:25:19.417367008 +0000 UTC m=+0.099484313 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:25:19 np0005474864 nova_compute[192593]: 2025-10-07 20:25:19.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:22 np0005474864 nova_compute[192593]: 2025-10-07 20:25:22.799 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:23 np0005474864 podman[230751]: 2025-10-07 20:25:23.417870132 +0000 UTC m=+0.099772551 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  7 16:25:23 np0005474864 podman[230749]: 2025-10-07 20:25:23.421786025 +0000 UTC m=+0.108563125 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=iscsid)
Oct  7 16:25:23 np0005474864 podman[230750]: 2025-10-07 20:25:23.435472088 +0000 UTC m=+0.124622566 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:25:24 np0005474864 nova_compute[192593]: 2025-10-07 20:25:24.525 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:27 np0005474864 nova_compute[192593]: 2025-10-07 20:25:27.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:28 np0005474864 podman[230810]: 2025-10-07 20:25:28.386806473 +0000 UTC m=+0.077775479 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:25:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:28.776 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:2d:16:81 10.100.0.2 2001:db8::f816:3eff:fe2d:1681'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe2d:1681/64', 'neutron:device_id': 'ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5fbe464-abae-4582-802d-66b1adc9bc5d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e00660f2-0bb7-4ca4-9bc4-6a9508a9764e, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=fe54a6cd-5298-43b0-b430-285b56a30360) old=Port_Binding(mac=['fa:16:3e:2d:16:81 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5fbe464-abae-4582-802d-66b1adc9bc5d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:25:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:28.778 103685 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port fe54a6cd-5298-43b0-b430-285b56a30360 in datapath b5fbe464-abae-4582-802d-66b1adc9bc5d updated#033[00m
Oct  7 16:25:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:28.780 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b5fbe464-abae-4582-802d-66b1adc9bc5d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:25:28 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:28.782 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[6018bdd2-61e9-4460-9e24-4451bde8f7a4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:25:29 np0005474864 nova_compute[192593]: 2025-10-07 20:25:29.540 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:30 np0005474864 nova_compute[192593]: 2025-10-07 20:25:30.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:25:30 np0005474864 nova_compute[192593]: 2025-10-07 20:25:30.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.123 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.124 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.124 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.125 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.366 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.368 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5750MB free_disk=73.45462799072266GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.369 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.369 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.432 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.433 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.455 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Refreshing inventories for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.472 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Updating ProviderTree inventory for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.473 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Updating inventory in ProviderTree for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.491 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Refreshing aggregate associations for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.526 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Refreshing trait associations for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.549 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.567 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.587 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:25:31 np0005474864 nova_compute[192593]: 2025-10-07 20:25:31.588 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.219s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:25:32 np0005474864 podman[230830]: 2025-10-07 20:25:32.373033057 +0000 UTC m=+0.066769122 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Oct  7 16:25:32 np0005474864 nova_compute[192593]: 2025-10-07 20:25:32.584 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:25:32 np0005474864 nova_compute[192593]: 2025-10-07 20:25:32.805 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.166 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.167 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.186 2 DEBUG nova.compute.manager [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.285 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.286 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.292 2 DEBUG nova.virt.hardware [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.293 2 INFO nova.compute.claims [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Claim successful on node compute-2.ctlplane.example.com#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.423 2 DEBUG nova.compute.provider_tree [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.441 2 DEBUG nova.scheduler.client.report [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.472 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.187s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.474 2 DEBUG nova.compute.manager [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.528 2 DEBUG nova.compute.manager [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.528 2 DEBUG nova.network.neutron [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.541 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.547 2 INFO nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.572 2 DEBUG nova.compute.manager [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.701 2 DEBUG nova.compute.manager [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.703 2 DEBUG nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.703 2 INFO nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Creating image(s)#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.703 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "/var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.704 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "/var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.704 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "/var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.717 2 DEBUG nova.policy [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.719 2 DEBUG oslo_concurrency.processutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.807 2 DEBUG oslo_concurrency.processutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.087s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.809 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "e61835ff717abc381cf6f37b3b2d562fe207343b" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.810 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.835 2 DEBUG oslo_concurrency.processutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.918 2 DEBUG oslo_concurrency.processutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.083s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.919 2 DEBUG oslo_concurrency.processutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.977 2 DEBUG oslo_concurrency.processutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b,backing_fmt=raw /var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk 1073741824" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.979 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "e61835ff717abc381cf6f37b3b2d562fe207343b" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:25:34 np0005474864 nova_compute[192593]: 2025-10-07 20:25:34.981 2 DEBUG oslo_concurrency.processutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:25:35 np0005474864 nova_compute[192593]: 2025-10-07 20:25:35.074 2 DEBUG oslo_concurrency.processutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/e61835ff717abc381cf6f37b3b2d562fe207343b --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:25:35 np0005474864 nova_compute[192593]: 2025-10-07 20:25:35.076 2 DEBUG nova.virt.disk.api [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Checking if we can resize image /var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Oct  7 16:25:35 np0005474864 nova_compute[192593]: 2025-10-07 20:25:35.076 2 DEBUG oslo_concurrency.processutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:25:35 np0005474864 nova_compute[192593]: 2025-10-07 20:25:35.105 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:25:35 np0005474864 nova_compute[192593]: 2025-10-07 20:25:35.152 2 DEBUG oslo_concurrency.processutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:25:35 np0005474864 nova_compute[192593]: 2025-10-07 20:25:35.153 2 DEBUG nova.virt.disk.api [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Cannot resize image /var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Oct  7 16:25:35 np0005474864 nova_compute[192593]: 2025-10-07 20:25:35.153 2 DEBUG nova.objects.instance [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'migration_context' on Instance uuid 9db039b2-5ec0-490b-bb78-b07d2f6341c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:25:35 np0005474864 nova_compute[192593]: 2025-10-07 20:25:35.167 2 DEBUG nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Oct  7 16:25:35 np0005474864 nova_compute[192593]: 2025-10-07 20:25:35.168 2 DEBUG nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Ensure instance console log exists: /var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Oct  7 16:25:35 np0005474864 nova_compute[192593]: 2025-10-07 20:25:35.169 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:25:35 np0005474864 nova_compute[192593]: 2025-10-07 20:25:35.169 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:25:35 np0005474864 nova_compute[192593]: 2025-10-07 20:25:35.170 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:25:36 np0005474864 nova_compute[192593]: 2025-10-07 20:25:36.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:25:36 np0005474864 nova_compute[192593]: 2025-10-07 20:25:36.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:25:36 np0005474864 nova_compute[192593]: 2025-10-07 20:25:36.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:25:36 np0005474864 nova_compute[192593]: 2025-10-07 20:25:36.118 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Skipping network cache update for instance because it is Building. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9871#033[00m
Oct  7 16:25:36 np0005474864 nova_compute[192593]: 2025-10-07 20:25:36.119 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:25:36 np0005474864 nova_compute[192593]: 2025-10-07 20:25:36.144 2 DEBUG nova.network.neutron [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Successfully created port: 72451f99-693c-42bb-bddb-f2b4b09ac4fd _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Oct  7 16:25:37 np0005474864 nova_compute[192593]: 2025-10-07 20:25:37.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:25:37 np0005474864 podman[230869]: 2025-10-07 20:25:37.38899365 +0000 UTC m=+0.089139335 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:25:37 np0005474864 nova_compute[192593]: 2025-10-07 20:25:37.808 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:38 np0005474864 nova_compute[192593]: 2025-10-07 20:25:38.125 2 DEBUG nova.network.neutron [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Successfully updated port: 72451f99-693c-42bb-bddb-f2b4b09ac4fd _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Oct  7 16:25:38 np0005474864 nova_compute[192593]: 2025-10-07 20:25:38.145 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "refresh_cache-9db039b2-5ec0-490b-bb78-b07d2f6341c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:25:38 np0005474864 nova_compute[192593]: 2025-10-07 20:25:38.145 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquired lock "refresh_cache-9db039b2-5ec0-490b-bb78-b07d2f6341c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:25:38 np0005474864 nova_compute[192593]: 2025-10-07 20:25:38.145 2 DEBUG nova.network.neutron [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Oct  7 16:25:38 np0005474864 nova_compute[192593]: 2025-10-07 20:25:38.246 2 DEBUG nova.compute.manager [req-8a400048-963e-4789-a1db-ffb86cf137b5 req-1e4b8bdc-a6b1-42f5-a3e7-c01943dce6ae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Received event network-changed-72451f99-693c-42bb-bddb-f2b4b09ac4fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:25:38 np0005474864 nova_compute[192593]: 2025-10-07 20:25:38.247 2 DEBUG nova.compute.manager [req-8a400048-963e-4789-a1db-ffb86cf137b5 req-1e4b8bdc-a6b1-42f5-a3e7-c01943dce6ae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Refreshing instance network info cache due to event network-changed-72451f99-693c-42bb-bddb-f2b4b09ac4fd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:25:38 np0005474864 nova_compute[192593]: 2025-10-07 20:25:38.247 2 DEBUG oslo_concurrency.lockutils [req-8a400048-963e-4789-a1db-ffb86cf137b5 req-1e4b8bdc-a6b1-42f5-a3e7-c01943dce6ae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-9db039b2-5ec0-490b-bb78-b07d2f6341c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:25:38 np0005474864 nova_compute[192593]: 2025-10-07 20:25:38.339 2 DEBUG nova.network.neutron [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Oct  7 16:25:39 np0005474864 nova_compute[192593]: 2025-10-07 20:25:39.091 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:25:39 np0005474864 nova_compute[192593]: 2025-10-07 20:25:39.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:25:39 np0005474864 nova_compute[192593]: 2025-10-07 20:25:39.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:25:39 np0005474864 nova_compute[192593]: 2025-10-07 20:25:39.576 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.749 2 DEBUG nova.network.neutron [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Updating instance_info_cache with network_info: [{"id": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "address": "fa:16:3e:0e:98:a1", "network": {"id": "b5fbe464-abae-4582-802d-66b1adc9bc5d", "bridge": "br-int", "label": "tempest-network-smoke--169082249", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0e:98a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72451f99-69", "ovs_interfaceid": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.787 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Releasing lock "refresh_cache-9db039b2-5ec0-490b-bb78-b07d2f6341c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.788 2 DEBUG nova.compute.manager [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Instance network_info: |[{"id": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "address": "fa:16:3e:0e:98:a1", "network": {"id": "b5fbe464-abae-4582-802d-66b1adc9bc5d", "bridge": "br-int", "label": "tempest-network-smoke--169082249", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0e:98a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72451f99-69", "ovs_interfaceid": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.789 2 DEBUG oslo_concurrency.lockutils [req-8a400048-963e-4789-a1db-ffb86cf137b5 req-1e4b8bdc-a6b1-42f5-a3e7-c01943dce6ae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-9db039b2-5ec0-490b-bb78-b07d2f6341c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.789 2 DEBUG nova.network.neutron [req-8a400048-963e-4789-a1db-ffb86cf137b5 req-1e4b8bdc-a6b1-42f5-a3e7-c01943dce6ae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Refreshing network info cache for port 72451f99-693c-42bb-bddb-f2b4b09ac4fd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.794 2 DEBUG nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Start _get_guest_xml network_info=[{"id": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "address": "fa:16:3e:0e:98:a1", "network": {"id": "b5fbe464-abae-4582-802d-66b1adc9bc5d", "bridge": "br-int", "label": "tempest-network-smoke--169082249", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0e:98a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72451f99-69", "ovs_interfaceid": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'device_name': '/dev/vda', 'boot_index': 0, 'guest_format': None, 'encryption_options': None, 'encryption_format': None, 'size': 0, 'disk_bus': 'virtio', 'encryption_secret_uuid': None, 'encrypted': False, 'device_type': 'disk', 'image_id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.800 2 WARNING nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.805 2 DEBUG nova.virt.libvirt.host [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.806 2 DEBUG nova.virt.libvirt.host [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.817 2 DEBUG nova.virt.libvirt.host [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.818 2 DEBUG nova.virt.libvirt.host [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.820 2 DEBUG nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.821 2 DEBUG nova.virt.hardware [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-10-07T20:09:16Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='3fec056a-1226-48ad-a02c-e4fe097a9363',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-10-07T20:09:18Z,direct_url=<?>,disk_format='qcow2',id=3c70ce5f-6f9a-4def-9c79-e5a33d631679,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='0382839ef71e4276aa5e57e3b819687c',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-10-07T20:09:19Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.822 2 DEBUG nova.virt.hardware [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.822 2 DEBUG nova.virt.hardware [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.822 2 DEBUG nova.virt.hardware [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.823 2 DEBUG nova.virt.hardware [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.823 2 DEBUG nova.virt.hardware [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.824 2 DEBUG nova.virt.hardware [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.824 2 DEBUG nova.virt.hardware [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.824 2 DEBUG nova.virt.hardware [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.825 2 DEBUG nova.virt.hardware [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.825 2 DEBUG nova.virt.hardware [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.832 2 DEBUG nova.virt.libvirt.vif [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:25:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1984576638',display_name='tempest-TestGettingAddress-server-1984576638',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1984576638',id=56,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFXMC9BHV4qYivbDJxGRtnm/v2GS7hcRhCb8qo5+Gp9JpLYRJPsnY0G6yPoLD7SXJAgR2NH3iQzGauZI0CtfWyA4PpJm+ved5s7oNOrv+n1zHgrMx00oWwi21LGuUsNrTw==',key_name='tempest-TestGettingAddress-422350208',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-25nx8bcz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:25:34Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=9db039b2-5ec0-490b-bb78-b07d2f6341c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "address": "fa:16:3e:0e:98:a1", "network": {"id": "b5fbe464-abae-4582-802d-66b1adc9bc5d", "bridge": "br-int", "label": "tempest-network-smoke--169082249", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0e:98a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72451f99-69", "ovs_interfaceid": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.832 2 DEBUG nova.network.os_vif_util [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "address": "fa:16:3e:0e:98:a1", "network": {"id": "b5fbe464-abae-4582-802d-66b1adc9bc5d", "bridge": "br-int", "label": "tempest-network-smoke--169082249", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0e:98a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72451f99-69", "ovs_interfaceid": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.834 2 DEBUG nova.network.os_vif_util [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:98:a1,bridge_name='br-int',has_traffic_filtering=True,id=72451f99-693c-42bb-bddb-f2b4b09ac4fd,network=Network(b5fbe464-abae-4582-802d-66b1adc9bc5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72451f99-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.835 2 DEBUG nova.objects.instance [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'pci_devices' on Instance uuid 9db039b2-5ec0-490b-bb78-b07d2f6341c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.856 2 DEBUG nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] End _get_guest_xml xml=<domain type="kvm">
Oct  7 16:25:40 np0005474864 nova_compute[192593]:  <uuid>9db039b2-5ec0-490b-bb78-b07d2f6341c1</uuid>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:  <name>instance-00000038</name>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:  <memory>131072</memory>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:  <vcpu>1</vcpu>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:  <metadata>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <nova:name>tempest-TestGettingAddress-server-1984576638</nova:name>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <nova:creationTime>2025-10-07 20:25:40</nova:creationTime>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <nova:flavor name="m1.nano">
Oct  7 16:25:40 np0005474864 nova_compute[192593]:        <nova:memory>128</nova:memory>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:        <nova:disk>1</nova:disk>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:        <nova:swap>0</nova:swap>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:        <nova:ephemeral>0</nova:ephemeral>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:        <nova:vcpus>1</nova:vcpus>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      </nova:flavor>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <nova:owner>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:        <nova:user uuid="334f092941fc46c496c7def76b2cfe18">tempest-TestGettingAddress-626136673-project-member</nova:user>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:        <nova:project uuid="2f9bf744045540618c9980fd4a7694f5">tempest-TestGettingAddress-626136673</nova:project>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      </nova:owner>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <nova:root type="image" uuid="3c70ce5f-6f9a-4def-9c79-e5a33d631679"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <nova:ports>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:        <nova:port uuid="72451f99-693c-42bb-bddb-f2b4b09ac4fd">
Oct  7 16:25:40 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe0e:98a1" ipVersion="6"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:        </nova:port>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      </nova:ports>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    </nova:instance>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:  </metadata>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:  <sysinfo type="smbios">
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <system>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <entry name="manufacturer">RDO</entry>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <entry name="product">OpenStack Compute</entry>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <entry name="serial">9db039b2-5ec0-490b-bb78-b07d2f6341c1</entry>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <entry name="uuid">9db039b2-5ec0-490b-bb78-b07d2f6341c1</entry>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <entry name="family">Virtual Machine</entry>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    </system>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:  </sysinfo>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:  <os>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <type arch="x86_64" machine="q35">hvm</type>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <boot dev="hd"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <smbios mode="sysinfo"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:  </os>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:  <features>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <acpi/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <apic/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <vmcoreinfo/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:  </features>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:  <clock offset="utc">
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <timer name="pit" tickpolicy="delay"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <timer name="rtc" tickpolicy="catchup"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <timer name="hpet" present="no"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:  </clock>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:  <cpu mode="custom" match="exact">
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <model>Nehalem</model>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <topology sockets="1" cores="1" threads="1"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:  </cpu>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:  <devices>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <disk type="file" device="disk">
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <driver name="qemu" type="qcow2" cache="none"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <target dev="vda" bus="virtio"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <disk type="file" device="cdrom">
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <driver name="qemu" type="raw" cache="none"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <source file="/var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.config"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <target dev="sda" bus="sata"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    </disk>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <interface type="ethernet">
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <mac address="fa:16:3e:0e:98:a1"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <driver name="vhost" rx_queue_size="512"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <mtu size="1442"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <target dev="tap72451f99-69"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    </interface>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <serial type="pty">
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <log file="/var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/console.log" append="off"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    </serial>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <video>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <model type="virtio"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    </video>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <input type="tablet" bus="usb"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <rng model="virtio">
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <backend model="random">/dev/urandom</backend>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    </rng>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="pci" model="pcie-root-port"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <controller type="usb" index="0"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    <memballoon model="virtio">
Oct  7 16:25:40 np0005474864 nova_compute[192593]:      <stats period="10"/>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:    </memballoon>
Oct  7 16:25:40 np0005474864 nova_compute[192593]:  </devices>
Oct  7 16:25:40 np0005474864 nova_compute[192593]: </domain>
Oct  7 16:25:40 np0005474864 nova_compute[192593]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.859 2 DEBUG nova.compute.manager [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Preparing to wait for external event network-vif-plugged-72451f99-693c-42bb-bddb-f2b4b09ac4fd prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.859 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.859 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.860 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.861 2 DEBUG nova.virt.libvirt.vif [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-10-07T20:25:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1984576638',display_name='tempest-TestGettingAddress-server-1984576638',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1984576638',id=56,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFXMC9BHV4qYivbDJxGRtnm/v2GS7hcRhCb8qo5+Gp9JpLYRJPsnY0G6yPoLD7SXJAgR2NH3iQzGauZI0CtfWyA4PpJm+ved5s7oNOrv+n1zHgrMx00oWwi21LGuUsNrTw==',key_name='tempest-TestGettingAddress-422350208',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-25nx8bcz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-10-07T20:25:34Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=9db039b2-5ec0-490b-bb78-b07d2f6341c1,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "address": "fa:16:3e:0e:98:a1", "network": {"id": "b5fbe464-abae-4582-802d-66b1adc9bc5d", "bridge": "br-int", "label": "tempest-network-smoke--169082249", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0e:98a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72451f99-69", "ovs_interfaceid": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.862 2 DEBUG nova.network.os_vif_util [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "address": "fa:16:3e:0e:98:a1", "network": {"id": "b5fbe464-abae-4582-802d-66b1adc9bc5d", "bridge": "br-int", "label": "tempest-network-smoke--169082249", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0e:98a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72451f99-69", "ovs_interfaceid": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.863 2 DEBUG nova.network.os_vif_util [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:0e:98:a1,bridge_name='br-int',has_traffic_filtering=True,id=72451f99-693c-42bb-bddb-f2b4b09ac4fd,network=Network(b5fbe464-abae-4582-802d-66b1adc9bc5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72451f99-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.864 2 DEBUG os_vif [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:98:a1,bridge_name='br-int',has_traffic_filtering=True,id=72451f99-693c-42bb-bddb-f2b4b09ac4fd,network=Network(b5fbe464-abae-4582-802d-66b1adc9bc5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72451f99-69') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.865 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.865 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.866 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.872 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.873 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap72451f99-69, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.873 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap72451f99-69, col_values=(('external_ids', {'iface-id': '72451f99-693c-42bb-bddb-f2b4b09ac4fd', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:0e:98:a1', 'vm-uuid': '9db039b2-5ec0-490b-bb78-b07d2f6341c1'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.876 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:40 np0005474864 NetworkManager[51631]: <info>  [1759868740.8780] manager: (tap72451f99-69): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/149)
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.879 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.886 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:40 np0005474864 nova_compute[192593]: 2025-10-07 20:25:40.888 2 INFO os_vif [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:0e:98:a1,bridge_name='br-int',has_traffic_filtering=True,id=72451f99-693c-42bb-bddb-f2b4b09ac4fd,network=Network(b5fbe464-abae-4582-802d-66b1adc9bc5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72451f99-69')#033[00m
Oct  7 16:25:41 np0005474864 nova_compute[192593]: 2025-10-07 20:25:41.178 2 DEBUG nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:25:41 np0005474864 nova_compute[192593]: 2025-10-07 20:25:41.179 2 DEBUG nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Oct  7 16:25:41 np0005474864 nova_compute[192593]: 2025-10-07 20:25:41.179 2 DEBUG nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] No VIF found with MAC fa:16:3e:0e:98:a1, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Oct  7 16:25:41 np0005474864 nova_compute[192593]: 2025-10-07 20:25:41.180 2 INFO nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Using config drive#033[00m
Oct  7 16:25:42 np0005474864 nova_compute[192593]: 2025-10-07 20:25:42.036 2 INFO nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Creating config drive at /var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.config#033[00m
Oct  7 16:25:42 np0005474864 nova_compute[192593]: 2025-10-07 20:25:42.047 2 DEBUG oslo_concurrency.processutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_i8giql8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:25:42 np0005474864 nova_compute[192593]: 2025-10-07 20:25:42.189 2 DEBUG oslo_concurrency.processutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_i8giql8" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:25:42 np0005474864 kernel: tap72451f99-69: entered promiscuous mode
Oct  7 16:25:42 np0005474864 NetworkManager[51631]: <info>  [1759868742.2706] manager: (tap72451f99-69): new Tun device (/org/freedesktop/NetworkManager/Devices/150)
Oct  7 16:25:42 np0005474864 ovn_controller[94801]: 2025-10-07T20:25:42Z|00287|binding|INFO|Claiming lport 72451f99-693c-42bb-bddb-f2b4b09ac4fd for this chassis.
Oct  7 16:25:42 np0005474864 ovn_controller[94801]: 2025-10-07T20:25:42Z|00288|binding|INFO|72451f99-693c-42bb-bddb-f2b4b09ac4fd: Claiming fa:16:3e:0e:98:a1 10.100.0.7 2001:db8::f816:3eff:fe0e:98a1
Oct  7 16:25:42 np0005474864 nova_compute[192593]: 2025-10-07 20:25:42.270 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:42 np0005474864 nova_compute[192593]: 2025-10-07 20:25:42.274 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:42 np0005474864 nova_compute[192593]: 2025-10-07 20:25:42.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:42 np0005474864 nova_compute[192593]: 2025-10-07 20:25:42.288 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.304 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:98:a1 10.100.0.7 2001:db8::f816:3eff:fe0e:98a1'], port_security=['fa:16:3e:0e:98:a1 10.100.0.7 2001:db8::f816:3eff:fe0e:98a1'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28 2001:db8::f816:3eff:fe0e:98a1/64', 'neutron:device_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5fbe464-abae-4582-802d-66b1adc9bc5d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '2', 'neutron:security_group_ids': '6b3e328b-3967-4ad5-a38e-2a20145637ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e00660f2-0bb7-4ca4-9bc4-6a9508a9764e, chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=72451f99-693c-42bb-bddb-f2b4b09ac4fd) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.306 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 72451f99-693c-42bb-bddb-f2b4b09ac4fd in datapath b5fbe464-abae-4582-802d-66b1adc9bc5d bound to our chassis#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.309 103685 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b5fbe464-abae-4582-802d-66b1adc9bc5d#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.325 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[43c86e65-6ca5-4b0f-a9ae-3c3d85a3a0b5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.326 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb5fbe464-a1 in ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Oct  7 16:25:42 np0005474864 systemd-udevd[230911]: Network interface NamePolicy= disabled on kernel command line.
Oct  7 16:25:42 np0005474864 systemd-machined[152586]: New machine qemu-20-instance-00000038.
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.330 220243 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb5fbe464-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.331 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[21cba720-e0ec-43f4-8a62-3255536b7e30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.332 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[dd23842f-854c-4f1a-9072-c627d471143c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:25:42 np0005474864 NetworkManager[51631]: <info>  [1759868742.3457] device (tap72451f99-69): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Oct  7 16:25:42 np0005474864 NetworkManager[51631]: <info>  [1759868742.3478] device (tap72451f99-69): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Oct  7 16:25:42 np0005474864 ovn_controller[94801]: 2025-10-07T20:25:42Z|00289|binding|INFO|Setting lport 72451f99-693c-42bb-bddb-f2b4b09ac4fd ovn-installed in OVS
Oct  7 16:25:42 np0005474864 ovn_controller[94801]: 2025-10-07T20:25:42Z|00290|binding|INFO|Setting lport 72451f99-693c-42bb-bddb-f2b4b09ac4fd up in Southbound
Oct  7 16:25:42 np0005474864 systemd[1]: Started Virtual Machine qemu-20-instance-00000038.
Oct  7 16:25:42 np0005474864 nova_compute[192593]: 2025-10-07 20:25:42.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.353 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[dca1e320-4a70-4ced-8f9f-31e05219f4df]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.389 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[8da80c41-c38e-4f1f-8dd1-9aeb037f4511]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.443 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[06b8b061-e6d1-4c98-911b-7aa9e886106d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:25:42 np0005474864 NetworkManager[51631]: <info>  [1759868742.4524] manager: (tapb5fbe464-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/151)
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.452 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[4d7e9566-1f1b-43ac-8f10-0fa150a4b31f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.496 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[8a9a0e02-78ee-4616-a91f-13d79695610c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.502 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[e287acb9-b87f-4ff0-86c9-4e0db6635011]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:25:42 np0005474864 NetworkManager[51631]: <info>  [1759868742.5375] device (tapb5fbe464-a0): carrier: link connected
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.548 220454 DEBUG oslo.privsep.daemon [-] privsep: reply[c63138aa-d929-4fc2-bb39-717ec0e88e02]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.582 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[922036fc-b1d7-4a55-aaae-1ec36728fcbb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5fbe464-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:16:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 437843, 'reachable_time': 38165, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 230943, 'error': None, 'target': 'ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.605 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ebdca446-88e9-4d7e-887f-ae3764f03f6a]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2d:1681'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 437843, 'tstamp': 437843}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 230944, 'error': None, 'target': 'ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.631 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[2fbe8088-0e38-4848-a26b-2c16909d7c76]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb5fbe464-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2d:16:81'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 110, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 86], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 437843, 'reachable_time': 38165, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 96, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 96, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 230945, 'error': None, 'target': 'ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.665 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[74fcbbbb-3533-4422-9bd7-3fb0ec4749ef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.739 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[804c24dc-d8d4-4779-84a9-03acea853edf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.741 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5fbe464-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.741 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.742 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb5fbe464-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:25:42 np0005474864 kernel: tapb5fbe464-a0: entered promiscuous mode
Oct  7 16:25:42 np0005474864 NetworkManager[51631]: <info>  [1759868742.7455] manager: (tapb5fbe464-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/152)
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.749 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb5fbe464-a0, col_values=(('external_ids', {'iface-id': 'fe54a6cd-5298-43b0-b430-285b56a30360'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:25:42 np0005474864 ovn_controller[94801]: 2025-10-07T20:25:42Z|00291|binding|INFO|Releasing lport fe54a6cd-5298-43b0-b430-285b56a30360 from this chassis (sb_readonly=0)
Oct  7 16:25:42 np0005474864 nova_compute[192593]: 2025-10-07 20:25:42.752 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:42 np0005474864 nova_compute[192593]: 2025-10-07 20:25:42.777 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.779 103685 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b5fbe464-abae-4582-802d-66b1adc9bc5d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b5fbe464-abae-4582-802d-66b1adc9bc5d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.780 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[ec89db2e-a2b0-49d2-a864-d6db8bfcf165]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.782 103685 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: global
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    log         /dev/log local0 debug
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    log-tag     haproxy-metadata-proxy-b5fbe464-abae-4582-802d-66b1adc9bc5d
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    user        root
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    group       root
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    maxconn     1024
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    pidfile     /var/lib/neutron/external/pids/b5fbe464-abae-4582-802d-66b1adc9bc5d.pid.haproxy
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    daemon
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: defaults
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    log global
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    mode http
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    option httplog
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    option dontlognull
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    option http-server-close
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    option forwardfor
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    retries                 3
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    timeout http-request    30s
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    timeout connect         30s
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    timeout client          32s
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    timeout server          32s
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    timeout http-keep-alive 30s
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: listen listener
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    bind 169.254.169.254:80
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    server metadata /var/lib/neutron/metadata_proxy
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]:    http-request add-header X-OVN-Network-ID b5fbe464-abae-4582-802d-66b1adc9bc5d
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Oct  7 16:25:42 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:25:42.783 103685 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d', 'env', 'PROCESS_TAG=haproxy-b5fbe464-abae-4582-802d-66b1adc9bc5d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b5fbe464-abae-4582-802d-66b1adc9bc5d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Oct  7 16:25:42 np0005474864 nova_compute[192593]: 2025-10-07 20:25:42.921 2 DEBUG nova.compute.manager [req-bf2b82c5-d1c0-4cc4-87c8-d62482cfaf67 req-12b76486-0207-4b9c-8c57-ec92e72fde2f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Received event network-vif-plugged-72451f99-693c-42bb-bddb-f2b4b09ac4fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:25:42 np0005474864 nova_compute[192593]: 2025-10-07 20:25:42.923 2 DEBUG oslo_concurrency.lockutils [req-bf2b82c5-d1c0-4cc4-87c8-d62482cfaf67 req-12b76486-0207-4b9c-8c57-ec92e72fde2f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:25:42 np0005474864 nova_compute[192593]: 2025-10-07 20:25:42.924 2 DEBUG oslo_concurrency.lockutils [req-bf2b82c5-d1c0-4cc4-87c8-d62482cfaf67 req-12b76486-0207-4b9c-8c57-ec92e72fde2f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:25:42 np0005474864 nova_compute[192593]: 2025-10-07 20:25:42.924 2 DEBUG oslo_concurrency.lockutils [req-bf2b82c5-d1c0-4cc4-87c8-d62482cfaf67 req-12b76486-0207-4b9c-8c57-ec92e72fde2f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:25:42 np0005474864 nova_compute[192593]: 2025-10-07 20:25:42.925 2 DEBUG nova.compute.manager [req-bf2b82c5-d1c0-4cc4-87c8-d62482cfaf67 req-12b76486-0207-4b9c-8c57-ec92e72fde2f 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Processing event network-vif-plugged-72451f99-693c-42bb-bddb-f2b4b09ac4fd _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Oct  7 16:25:43 np0005474864 podman[230984]: 2025-10-07 20:25:43.198662052 +0000 UTC m=+0.062879799 container create c4a7cb0dd099ef4db5f6c5d76676f73f5d64460adf0fd323bfc75e36953bc839 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  7 16:25:43 np0005474864 systemd[1]: Started libpod-conmon-c4a7cb0dd099ef4db5f6c5d76676f73f5d64460adf0fd323bfc75e36953bc839.scope.
Oct  7 16:25:43 np0005474864 podman[230984]: 2025-10-07 20:25:43.166688452 +0000 UTC m=+0.030906219 image pull 26280da617d52ac64ac1fa9a18a315d65ac237c1373028f8064008a821dbfd8d quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Oct  7 16:25:43 np0005474864 systemd[1]: Started libcrun container.
Oct  7 16:25:43 np0005474864 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bd2945e0d490cd27160323285c6b242905bba1175b5625f732a99474c8eadd85/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.309 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868743.3088505, 9db039b2-5ec0-490b-bb78-b07d2f6341c1 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.312 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] VM Started (Lifecycle Event)#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.314 2 DEBUG nova.compute.manager [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.317 2 DEBUG nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.321 2 INFO nova.virt.libvirt.driver [-] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Instance spawned successfully.#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.321 2 DEBUG nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Oct  7 16:25:43 np0005474864 podman[230984]: 2025-10-07 20:25:43.326558211 +0000 UTC m=+0.190775988 container init c4a7cb0dd099ef4db5f6c5d76676f73f5d64460adf0fd323bfc75e36953bc839 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:25:43 np0005474864 podman[230984]: 2025-10-07 20:25:43.332686557 +0000 UTC m=+0.196904304 container start c4a7cb0dd099ef4db5f6c5d76676f73f5d64460adf0fd323bfc75e36953bc839 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Oct  7 16:25:43 np0005474864 neutron-haproxy-ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d[230999]: [NOTICE]   (231003) : New worker (231005) forked
Oct  7 16:25:43 np0005474864 neutron-haproxy-ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d[230999]: [NOTICE]   (231003) : Loading success.
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.358 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.363 2 DEBUG nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.364 2 DEBUG nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.365 2 DEBUG nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.365 2 DEBUG nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.366 2 DEBUG nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.366 2 DEBUG nova.virt.libvirt.driver [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.372 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.404 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.404 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868743.3107834, 9db039b2-5ec0-490b-bb78-b07d2f6341c1 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.404 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] VM Paused (Lifecycle Event)#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.435 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.438 2 DEBUG nova.virt.driver [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] Emitting event <LifecycleEvent: 1759868743.3167377, 9db039b2-5ec0-490b-bb78-b07d2f6341c1 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.438 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] VM Resumed (Lifecycle Event)#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.450 2 INFO nova.compute.manager [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Took 8.75 seconds to spawn the instance on the hypervisor.#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.450 2 DEBUG nova.compute.manager [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.460 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.462 2 DEBUG nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.489 2 INFO nova.compute.manager [None req-146ef75d-70b2-403b-81d9-c56bd67c808d - - - - - -] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.518 2 INFO nova.compute.manager [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Took 9.26 seconds to build instance.#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.539 2 DEBUG oslo_concurrency.lockutils [None req-eecd176a-62be-4e85-a49d-1a415131ccb7 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.772 2 DEBUG nova.network.neutron [req-8a400048-963e-4789-a1db-ffb86cf137b5 req-1e4b8bdc-a6b1-42f5-a3e7-c01943dce6ae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Updated VIF entry in instance network info cache for port 72451f99-693c-42bb-bddb-f2b4b09ac4fd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.772 2 DEBUG nova.network.neutron [req-8a400048-963e-4789-a1db-ffb86cf137b5 req-1e4b8bdc-a6b1-42f5-a3e7-c01943dce6ae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Updating instance_info_cache with network_info: [{"id": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "address": "fa:16:3e:0e:98:a1", "network": {"id": "b5fbe464-abae-4582-802d-66b1adc9bc5d", "bridge": "br-int", "label": "tempest-network-smoke--169082249", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0e:98a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72451f99-69", "ovs_interfaceid": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:25:43 np0005474864 nova_compute[192593]: 2025-10-07 20:25:43.788 2 DEBUG oslo_concurrency.lockutils [req-8a400048-963e-4789-a1db-ffb86cf137b5 req-1e4b8bdc-a6b1-42f5-a3e7-c01943dce6ae 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-9db039b2-5ec0-490b-bb78-b07d2f6341c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:25:44 np0005474864 nova_compute[192593]: 2025-10-07 20:25:44.579 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:45 np0005474864 nova_compute[192593]: 2025-10-07 20:25:45.357 2 DEBUG nova.compute.manager [req-a41f22aa-ccc8-4650-8ce8-2d2374cb6ed7 req-4e09e5dc-1e74-455f-a3ed-8c07a67f96dc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Received event network-vif-plugged-72451f99-693c-42bb-bddb-f2b4b09ac4fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:25:45 np0005474864 nova_compute[192593]: 2025-10-07 20:25:45.358 2 DEBUG oslo_concurrency.lockutils [req-a41f22aa-ccc8-4650-8ce8-2d2374cb6ed7 req-4e09e5dc-1e74-455f-a3ed-8c07a67f96dc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:25:45 np0005474864 nova_compute[192593]: 2025-10-07 20:25:45.358 2 DEBUG oslo_concurrency.lockutils [req-a41f22aa-ccc8-4650-8ce8-2d2374cb6ed7 req-4e09e5dc-1e74-455f-a3ed-8c07a67f96dc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:25:45 np0005474864 nova_compute[192593]: 2025-10-07 20:25:45.359 2 DEBUG oslo_concurrency.lockutils [req-a41f22aa-ccc8-4650-8ce8-2d2374cb6ed7 req-4e09e5dc-1e74-455f-a3ed-8c07a67f96dc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:25:45 np0005474864 nova_compute[192593]: 2025-10-07 20:25:45.359 2 DEBUG nova.compute.manager [req-a41f22aa-ccc8-4650-8ce8-2d2374cb6ed7 req-4e09e5dc-1e74-455f-a3ed-8c07a67f96dc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] No waiting events found dispatching network-vif-plugged-72451f99-693c-42bb-bddb-f2b4b09ac4fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:25:45 np0005474864 nova_compute[192593]: 2025-10-07 20:25:45.359 2 WARNING nova.compute.manager [req-a41f22aa-ccc8-4650-8ce8-2d2374cb6ed7 req-4e09e5dc-1e74-455f-a3ed-8c07a67f96dc 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Received unexpected event network-vif-plugged-72451f99-693c-42bb-bddb-f2b4b09ac4fd for instance with vm_state active and task_state None.#033[00m
Oct  7 16:25:45 np0005474864 nova_compute[192593]: 2025-10-07 20:25:45.878 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:47 np0005474864 nova_compute[192593]: 2025-10-07 20:25:47.507 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:47 np0005474864 NetworkManager[51631]: <info>  [1759868747.5147] manager: (patch-br-int-to-provnet-53a6f422-cce6-4082-98aa-3619989346bc): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/153)
Oct  7 16:25:47 np0005474864 NetworkManager[51631]: <info>  [1759868747.5153] manager: (patch-provnet-53a6f422-cce6-4082-98aa-3619989346bc-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/154)
Oct  7 16:25:47 np0005474864 nova_compute[192593]: 2025-10-07 20:25:47.616 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:47 np0005474864 ovn_controller[94801]: 2025-10-07T20:25:47Z|00292|binding|INFO|Releasing lport fe54a6cd-5298-43b0-b430-285b56a30360 from this chassis (sb_readonly=0)
Oct  7 16:25:47 np0005474864 nova_compute[192593]: 2025-10-07 20:25:47.636 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:48 np0005474864 nova_compute[192593]: 2025-10-07 20:25:48.503 2 DEBUG nova.compute.manager [req-85880434-7cc3-46ca-886f-52a00e7407d7 req-c1b6c9dc-f369-4883-8644-1ed1ba575fd4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Received event network-changed-72451f99-693c-42bb-bddb-f2b4b09ac4fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:25:48 np0005474864 nova_compute[192593]: 2025-10-07 20:25:48.504 2 DEBUG nova.compute.manager [req-85880434-7cc3-46ca-886f-52a00e7407d7 req-c1b6c9dc-f369-4883-8644-1ed1ba575fd4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Refreshing instance network info cache due to event network-changed-72451f99-693c-42bb-bddb-f2b4b09ac4fd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:25:48 np0005474864 nova_compute[192593]: 2025-10-07 20:25:48.504 2 DEBUG oslo_concurrency.lockutils [req-85880434-7cc3-46ca-886f-52a00e7407d7 req-c1b6c9dc-f369-4883-8644-1ed1ba575fd4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-9db039b2-5ec0-490b-bb78-b07d2f6341c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:25:48 np0005474864 nova_compute[192593]: 2025-10-07 20:25:48.505 2 DEBUG oslo_concurrency.lockutils [req-85880434-7cc3-46ca-886f-52a00e7407d7 req-c1b6c9dc-f369-4883-8644-1ed1ba575fd4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-9db039b2-5ec0-490b-bb78-b07d2f6341c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:25:48 np0005474864 nova_compute[192593]: 2025-10-07 20:25:48.505 2 DEBUG nova.network.neutron [req-85880434-7cc3-46ca-886f-52a00e7407d7 req-c1b6c9dc-f369-4883-8644-1ed1ba575fd4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Refreshing network info cache for port 72451f99-693c-42bb-bddb-f2b4b09ac4fd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:25:49 np0005474864 nova_compute[192593]: 2025-10-07 20:25:49.632 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:50 np0005474864 nova_compute[192593]: 2025-10-07 20:25:50.052 2 DEBUG nova.network.neutron [req-85880434-7cc3-46ca-886f-52a00e7407d7 req-c1b6c9dc-f369-4883-8644-1ed1ba575fd4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Updated VIF entry in instance network info cache for port 72451f99-693c-42bb-bddb-f2b4b09ac4fd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:25:50 np0005474864 nova_compute[192593]: 2025-10-07 20:25:50.053 2 DEBUG nova.network.neutron [req-85880434-7cc3-46ca-886f-52a00e7407d7 req-c1b6c9dc-f369-4883-8644-1ed1ba575fd4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Updating instance_info_cache with network_info: [{"id": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "address": "fa:16:3e:0e:98:a1", "network": {"id": "b5fbe464-abae-4582-802d-66b1adc9bc5d", "bridge": "br-int", "label": "tempest-network-smoke--169082249", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0e:98a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72451f99-69", "ovs_interfaceid": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:25:50 np0005474864 nova_compute[192593]: 2025-10-07 20:25:50.075 2 DEBUG oslo_concurrency.lockutils [req-85880434-7cc3-46ca-886f-52a00e7407d7 req-c1b6c9dc-f369-4883-8644-1ed1ba575fd4 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-9db039b2-5ec0-490b-bb78-b07d2f6341c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:25:50 np0005474864 podman[231015]: 2025-10-07 20:25:50.385237283 +0000 UTC m=+0.068580694 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  7 16:25:50 np0005474864 podman[231016]: 2025-10-07 20:25:50.41782106 +0000 UTC m=+0.107675509 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, io.openshift.expose-services=, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  7 16:25:50 np0005474864 nova_compute[192593]: 2025-10-07 20:25:50.882 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:54 np0005474864 podman[231071]: 2025-10-07 20:25:54.370373564 +0000 UTC m=+0.061432728 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:25:54 np0005474864 podman[231073]: 2025-10-07 20:25:54.406619966 +0000 UTC m=+0.091070130 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Oct  7 16:25:54 np0005474864 podman[231072]: 2025-10-07 20:25:54.436425104 +0000 UTC m=+0.120590480 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  7 16:25:54 np0005474864 nova_compute[192593]: 2025-10-07 20:25:54.678 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:55 np0005474864 ovn_controller[94801]: 2025-10-07T20:25:55Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:0e:98:a1 10.100.0.7
Oct  7 16:25:55 np0005474864 ovn_controller[94801]: 2025-10-07T20:25:55Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:0e:98:a1 10.100.0.7
Oct  7 16:25:55 np0005474864 nova_compute[192593]: 2025-10-07 20:25:55.915 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:25:59 np0005474864 podman[231138]: 2025-10-07 20:25:59.381605221 +0000 UTC m=+0.069458169 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Oct  7 16:25:59 np0005474864 nova_compute[192593]: 2025-10-07 20:25:59.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:00 np0005474864 nova_compute[192593]: 2025-10-07 20:26:00.928 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:03 np0005474864 podman[231158]: 2025-10-07 20:26:03.368190356 +0000 UTC m=+0.062037746 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:26:04 np0005474864 nova_compute[192593]: 2025-10-07 20:26:04.722 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:05 np0005474864 nova_compute[192593]: 2025-10-07 20:26:05.964 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:08 np0005474864 podman[231183]: 2025-10-07 20:26:08.397196441 +0000 UTC m=+0.086427957 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, 
org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001)
Oct  7 16:26:08 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:08.529 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:76:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'ba:bb:91:e8:2b:5d'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:26:08 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:08.530 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  7 16:26:08 np0005474864 nova_compute[192593]: 2025-10-07 20:26:08.563 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:09 np0005474864 nova_compute[192593]: 2025-10-07 20:26:09.763 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:11 np0005474864 nova_compute[192593]: 2025-10-07 20:26:11.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:14 np0005474864 nova_compute[192593]: 2025-10-07 20:26:14.802 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:16 np0005474864 nova_compute[192593]: 2025-10-07 20:26:16.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:16.201 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:26:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:16.202 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:26:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:16.203 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:26:18 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:18.532 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:26:19 np0005474864 nova_compute[192593]: 2025-10-07 20:26:19.804 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:21 np0005474864 nova_compute[192593]: 2025-10-07 20:26:21.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:21 np0005474864 ovn_controller[94801]: 2025-10-07T20:26:21Z|00293|binding|INFO|Releasing lport fe54a6cd-5298-43b0-b430-285b56a30360 from this chassis (sb_readonly=0)
Oct  7 16:26:21 np0005474864 nova_compute[192593]: 2025-10-07 20:26:21.333 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:21 np0005474864 podman[231204]: 2025-10-07 20:26:21.418069234 +0000 UTC m=+0.057208037 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, release=1755695350, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, architecture=x86_64, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  7 16:26:21 np0005474864 podman[231203]: 2025-10-07 20:26:21.444090513 +0000 UTC m=+0.080016733 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  7 16:26:24 np0005474864 nova_compute[192593]: 2025-10-07 20:26:24.806 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:25 np0005474864 podman[231249]: 2025-10-07 20:26:25.398075698 +0000 UTC m=+0.072552808 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  7 16:26:25 np0005474864 ovn_controller[94801]: 2025-10-07T20:26:25Z|00294|binding|INFO|Releasing lport fe54a6cd-5298-43b0-b430-285b56a30360 from this chassis (sb_readonly=0)
Oct  7 16:26:25 np0005474864 podman[231251]: 2025-10-07 20:26:25.434615249 +0000 UTC m=+0.105294630 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  7 16:26:25 np0005474864 podman[231250]: 2025-10-07 20:26:25.455908742 +0000 UTC m=+0.138552717 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:26:25 np0005474864 nova_compute[192593]: 2025-10-07 20:26:25.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:26 np0005474864 nova_compute[192593]: 2025-10-07 20:26:26.014 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:29 np0005474864 nova_compute[192593]: 2025-10-07 20:26:29.842 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:30 np0005474864 nova_compute[192593]: 2025-10-07 20:26:30.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:26:30 np0005474864 podman[231309]: 2025-10-07 20:26:30.381861235 +0000 UTC m=+0.070640273 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, 
org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, tcib_managed=true)
Oct  7 16:26:31 np0005474864 nova_compute[192593]: 2025-10-07 20:26:31.016 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.261 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'name': 'tempest-TestGettingAddress-server-1984576638', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000038', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '2f9bf744045540618c9980fd4a7694f5', 'user_id': '334f092941fc46c496c7def76b2cfe18', 'hostId': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.261 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.266 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 9db039b2-5ec0-490b-bb78-b07d2f6341c1 / tap72451f99-69 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.267 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/network.incoming.packets volume: 32 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0febeb52-0dcb-4494-8130-227e55b7b930', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 32, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000038-9db039b2-5ec0-490b-bb78-b07d2f6341c1-tap72451f99-69', 'timestamp': '2025-10-07T20:26:31.262084', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'tap72451f99-69', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:98:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72451f99-69'}, 'message_id': 'e90d913a-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.211900292, 'message_signature': 'c8bfe1a69032ad7408db97f69ffb5664d3bddab219fc27211847366f06d8f380'}]}, 'timestamp': '2025-10-07 20:26:31.268303', '_unique_id': '9ea4ad3e51554dd38b18bdac7a248cde'}: kombu.exceptions.OperationalError: [Errno 111] Connection 
refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.270 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.271 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.298 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.299 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'de63c609-f16d-4ab7-9d0e-628c8b69d93e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1-vda', 'timestamp': '2025-10-07T20:26:31.272004', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e912653e-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.22191746, 'message_signature': 'f5c9e4511ad543c0cd2da7717212398eee56c1e96b1d4e0291b542d511bc2059'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1-sda', 'timestamp': '2025-10-07T20:26:31.272004', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e91272f4-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.22191746, 'message_signature': 'fcb1d2595cf3643373e81bada1da7744805c064bedb126f2e5f0806f3775d337'}]}, 'timestamp': '2025-10-07 20:26:31.299885', '_unique_id': '374813e269fa421490cc60f8cb4ac043'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.301 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/network.outgoing.packets volume: 31 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '425fd29d-54e0-4d5c-9a76-dcf014e263dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 31, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000038-9db039b2-5ec0-490b-bb78-b07d2f6341c1-tap72451f99-69', 'timestamp': '2025-10-07T20:26:31.302037', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'tap72451f99-69', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:98:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72451f99-69'}, 'message_id': 'e912d1ea-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.211900292, 'message_signature': '8d97e29f87a75c2051997ca061d229ead25617e1223963c2700454dea2f9a3c5'}]}, 'timestamp': '2025-10-07 20:26:31.302325', '_unique_id': 'a3f36e183ddd434fbd95dc0398032e39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.302 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.303 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.303 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '222b9307-a319-4981-a38f-724a28bca8c6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000038-9db039b2-5ec0-490b-bb78-b07d2f6341c1-tap72451f99-69', 'timestamp': '2025-10-07T20:26:31.303493', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'tap72451f99-69', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:98:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72451f99-69'}, 'message_id': 'e9130980-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.211900292, 'message_signature': '71544d2db7f4110e4d46097fa1c83d44178b401a03e9b3a90db921f16c099023'}]}, 'timestamp': '2025-10-07 20:26:31.303728', '_unique_id': 'f9744ce60c5d4e028cd6558358f51879'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.304 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.305 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1984576638>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1984576638>]
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.305 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.324 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/cpu volume: 10970000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f64c5850-1fa6-4a16-9924-50bbe897b915', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 10970000000, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'timestamp': '2025-10-07T20:26:31.305294', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e9163a9c-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.27371714, 'message_signature': '442cbdb6b37f64e33af505049b23f14252bae97440fc27188dbcd697369d4f21'}]}, 'timestamp': '2025-10-07 20:26:31.324758', '_unique_id': '7cb8e985de714336ae3ec970a32bc70b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.326 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a79eda8b-5093-4b0c-a5ae-4563042860f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000038-9db039b2-5ec0-490b-bb78-b07d2f6341c1-tap72451f99-69', 'timestamp': '2025-10-07T20:26:31.327012', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'tap72451f99-69', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:98:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72451f99-69'}, 'message_id': 'e916a14e-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.211900292, 'message_signature': 'cbb2d3815ff76a5ddd9c5f3d090c2a07126739a812c5809fffc933247a1df248'}]}, 'timestamp': '2025-10-07 20:26:31.327313', '_unique_id': '71d203488c2e4f7ebc1a12f4b4ee48d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.327 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.328 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.328 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7209957d-75fb-4345-893d-bb87a9cb03b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000038-9db039b2-5ec0-490b-bb78-b07d2f6341c1-tap72451f99-69', 'timestamp': '2025-10-07T20:26:31.328676', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'tap72451f99-69', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:98:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72451f99-69'}, 'message_id': 'e916e1ae-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.211900292, 'message_signature': '18c25d947da1c50fc2a3dd3eb71944abd84f35006f8f65610cbbf16756e41992'}]}, 'timestamp': '2025-10-07 20:26:31.328941', '_unique_id': 'b776d3d1a16c4ce1a7a57a4df65da532'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.329 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.330 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.355 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.device.read.latency volume: 567062989 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.356 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.device.read.latency volume: 107544593 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fbe57af7-4ca4-4474-867e-6ec722394672', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 567062989, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1-vda', 'timestamp': '2025-10-07T20:26:31.330221', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e91b0edc-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.280058632, 'message_signature': 'e49d613ac75d360ee9d605ef2c3e49149136eceb09e26ccdaad447a5a093a35a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 107544593, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1-sda', 'timestamp': '2025-10-07T20:26:31.330221', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e91b23c2-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.280058632, 'message_signature': '6958dcd43c57c6f8693f779be1880b6dfd1d15e762aaf7a282582576d15bcf7a'}]}, 'timestamp': '2025-10-07 20:26:31.356884', '_unique_id': '3a26b7755d3c4b59919cb4190046cefa'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.358 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.359 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.359 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.device.write.requests volume: 321 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.359 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47767e70-a4e2-4bda-837b-c09c885ac0da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 321, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1-vda', 'timestamp': '2025-10-07T20:26:31.359478', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e91b964a-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.280058632, 'message_signature': 'f6342afbc3ee739e5bf828551329ec62e989025c1e2650f05d5f6099aa1550e2'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1-sda', 'timestamp': '2025-10-07T20:26:31.359478', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e91ba126-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.280058632, 'message_signature': '6c3c4e8e1c6e897d376081fe3f3a930d5f9c47f9bf0a8f3a59dda83e00368391'}]}, 'timestamp': '2025-10-07 20:26:31.360029', '_unique_id': 'fa172f67b7bb4467acfaaabcbb432cab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.360 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.361 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.361 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.361 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1984576638>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1984576638>]
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.361 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.362 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.device.read.bytes volume: 31009280 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.362 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '33961ef2-d1b9-40c7-8656-20885d917768', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31009280, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1-vda', 'timestamp': '2025-10-07T20:26:31.361998', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e91bf81a-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.280058632, 'message_signature': 'd02780932326e46a3cc8d1839207d329c3fef207dd4e36ba391f4aedcce14ce2'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1-sda', 'timestamp': '2025-10-07T20:26:31.361998', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e91c02b0-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.280058632, 'message_signature': '6ddfa92f807adf93dc70f6638c2b7ef8b60a0113e11cb0672884f43e342ae775'}]}, 'timestamp': '2025-10-07 20:26:31.362629', '_unique_id': '4d0e53d552bd4e93ad0078034f043fdc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.363 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.364 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.364 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.device.write.latency volume: 3090665416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.364 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '47c7cd1d-90cc-40a6-b4fb-df5b89c582b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3090665416, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1-vda', 'timestamp': '2025-10-07T20:26:31.364291', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e91c52c4-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.280058632, 'message_signature': '59484464da3319b48c9276c2d20feba16afd4ec0907f115de793f9d1708ffe54'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1-sda', 'timestamp': '2025-10-07T20:26:31.364291', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e91c5c56-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.280058632, 'message_signature': 'f560da444533d52b17fe4a32d7eed10dbb37960165b0463b2ecf20ee2fc626f2'}]}, 'timestamp': '2025-10-07 20:26:31.364818', '_unique_id': '3adbb1183fe546788a47861e34110c2f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.365 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.366 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.366 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2999c78c-5ab4-4e8e-990b-c1dcfc9299d7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000038-9db039b2-5ec0-490b-bb78-b07d2f6341c1-tap72451f99-69', 'timestamp': '2025-10-07T20:26:31.366273', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'tap72451f99-69', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:98:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72451f99-69'}, 'message_id': 'e91ca012-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.211900292, 'message_signature': '35dfd56f077c55104014fdaacdd53f26dd5d924bc903338f8d6dca88b495e222'}]}, 'timestamp': '2025-10-07 20:26:31.366587', '_unique_id': '517c8176af534ef5bf31637a227a54cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.367 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.368 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.368 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.device.read.requests volume: 1133 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.368 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a4615072-4253-4b52-bd34-84f89ff9b0ea', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1133, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1-vda', 'timestamp': '2025-10-07T20:26:31.368461', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e91cf602-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.280058632, 'message_signature': '224ed74c0a72096ce19ada50e3db4bf37c1466e73d17d64d9c70153b73252f17'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1-sda', 'timestamp': '2025-10-07T20:26:31.368461', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e91cffee-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.280058632, 'message_signature': 'f2ee815a77f4072d6b508c12c2928556aa6c0f54004cbd1ed8a1b2a7e7101cdc'}]}, 'timestamp': '2025-10-07 20:26:31.369005', '_unique_id': 'b6408f103e104f7e9effeaac5e938db3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.369 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.370 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.370 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.370 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1984576638>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1984576638>]
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.370 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.370 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '29804f6e-e9c4-40e1-8e3b-1e19d4adbd6a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000038-9db039b2-5ec0-490b-bb78-b07d2f6341c1-tap72451f99-69', 'timestamp': '2025-10-07T20:26:31.370943', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'tap72451f99-69', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:98:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72451f99-69'}, 'message_id': 'e91d5584-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.211900292, 'message_signature': '72c50a3e5162b8cb25eb47c5cb58cfcd9813f6abc6997dc8673f2b339bd38a30'}]}, 'timestamp': '2025-10-07 20:26:31.371212', '_unique_id': 'c00f626836f042a9a34091c6603bf540'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.371 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.372 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.372 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.373 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0edcf694-7915-4b16-b2b6-e745513d06dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1-vda', 'timestamp': '2025-10-07T20:26:31.372816', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e91da05c-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.22191746, 'message_signature': '22062c3da49e7cba97973b364841ca5126d6beaccb726733f332a0aebf62cdfc'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1-sda', 'timestamp': '2025-10-07T20:26:31.372816', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e91dad22-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.22191746, 'message_signature': 'ac673a538a02876ad2e7d8cc8f34935c33a684727ffa2282cf5e5107d659c575'}]}, 'timestamp': '2025-10-07 20:26:31.373474', '_unique_id': '5f539e91fe6a4f829eaf6f768f819b90'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.374 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.375 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.device.write.bytes volume: 72982528 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.375 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'feea3e5d-2f0c-4c28-a989-47da2ed775c8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72982528, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1-vda', 'timestamp': '2025-10-07T20:26:31.375038', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e91df566-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.280058632, 'message_signature': '877f9f95848260017193f1df707b0aea19768d7c87c319700506a20b7715ccdd'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 
'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1-sda', 'timestamp': '2025-10-07T20:26:31.375038', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e91dff8e-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.280058632, 'message_signature': 'd22d2aeedb9b7d42ebbd0c9effc1cfcb630e5a8c04463f4cfdb24e932a050788'}]}, 'timestamp': '2025-10-07 20:26:31.375548', '_unique_id': 'cbdb2f18c9244fb184838f975145f907'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.376 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/memory.usage volume: 42.76953125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef6f161a-80c2-45dd-983e-7fe1876cb891', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.76953125, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'timestamp': '2025-10-07T20:26:31.377028', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'e91e432c-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.27371714, 'message_signature': 'ce69e889b702c70f34cbd73c35d12d64ba052d4857ad002e468655e550bb866c'}]}, 'timestamp': '2025-10-07 20:26:31.377304', '_unique_id': 'fdec429d14cc465c8d1a36111c8d06a9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.377 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.378 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.378 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '307654a0-b083-4a8f-8217-c9f024a1b111', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1-vda', 'timestamp': '2025-10-07T20:26:31.378762', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e91e86de-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.22191746, 'message_signature': '15d655bdb54352ec0552b60a380645f176e6c5156ab4235441f1344535965d4a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1-sda', 'timestamp': '2025-10-07T20:26:31.378762', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'instance-00000038', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e91e9002-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.22191746, 'message_signature': 'bf706c4a408adbd218736213304ce79e3fc3640765b80333916ff549e786a59f'}]}, 'timestamp': '2025-10-07 20:26:31.379260', '_unique_id': 'f0d0be1958384626ac8c5072eabf8b52'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.379 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.380 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.380 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.381 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1984576638>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1984576638>]
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.381 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.381 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/network.outgoing.bytes volume: 3704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c8b165d3-428b-4dc1-9f16-882d82cd5459', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 3704, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000038-9db039b2-5ec0-490b-bb78-b07d2f6341c1-tap72451f99-69', 'timestamp': '2025-10-07T20:26:31.381409', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'tap72451f99-69', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:98:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72451f99-69'}, 'message_id': 'e91eee6c-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.211900292, 'message_signature': '665b4074d1a84c13d7769539618d1bbf986edb69e6238ee207e51a0a8250470e'}]}, 'timestamp': '2025-10-07 20:26:31.381677', '_unique_id': 'b911981a76ac4b87878a06e86129e5ef'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.382 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.383 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.383 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '83c96587-49fb-4c17-a3d9-637b876020ec', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000038-9db039b2-5ec0-490b-bb78-b07d2f6341c1-tap72451f99-69', 'timestamp': '2025-10-07T20:26:31.383302', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'tap72451f99-69', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:98:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72451f99-69'}, 'message_id': 'e91f385e-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.211900292, 'message_signature': '9574580d0ea769d3f6eef4d0316a04d205651293974810529e1b5be3d2b585e1'}]}, 'timestamp': '2025-10-07 20:26:31.383571', '_unique_id': 'bc1623f2b0fe4668b11e9f87ae312acc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.384 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 DEBUG ceilometer.compute.pollsters [-] 9db039b2-5ec0-490b-bb78-b07d2f6341c1/network.incoming.bytes volume: 4879 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e8f81cc-f965-421c-82f4-fafb0eecba6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 4879, 'user_id': '334f092941fc46c496c7def76b2cfe18', 'user_name': None, 'project_id': '2f9bf744045540618c9980fd4a7694f5', 'project_name': None, 'resource_id': 'instance-00000038-9db039b2-5ec0-490b-bb78-b07d2f6341c1-tap72451f99-69', 'timestamp': '2025-10-07T20:26:31.385091', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1984576638', 'name': 'tap72451f99-69', 'instance_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'instance_type': 'm1.nano', 'host': '9682c26e83bdb3c6d6e608ab584f0ee0f1f91259d9b4cb178ec77a18', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '3fec056a-1226-48ad-a02c-e4fe097a9363', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '3c70ce5f-6f9a-4def-9c79-e5a33d631679'}, 'image_ref': '3c70ce5f-6f9a-4def-9c79-e5a33d631679', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:0e:98:a1', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap72451f99-69'}, 'message_id': 'e91f7f26-a3bb-11f0-9441-fa163e5cce8e', 'monotonic_time': 4427.211900292, 'message_signature': '8e0d360483fcd05934c5c05fd715efb83fb83259de1fe25186979976e59f125e'}]}, 'timestamp': '2025-10-07 20:26:31.385383', '_unique_id': '87cf78310764438a8d2160152ab22b02'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     yield
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Oct  7 16:26:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:26:31.385 12 ERROR oslo_messaging.notify.messaging 
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.088 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.128 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.129 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.129 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.130 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.215 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.274 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.276 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.338 2 DEBUG oslo_concurrency.processutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.518 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.519 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5583MB free_disk=73.42593383789062GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.519 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.520 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.777 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Instance 9db039b2-5ec0-490b-bb78-b07d2f6341c1 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.777 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.778 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.835 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.855 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.885 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:26:32 np0005474864 nova_compute[192593]: 2025-10-07 20:26:32.885 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.366s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:26:34 np0005474864 podman[231337]: 2025-10-07 20:26:34.361356366 +0000 UTC m=+0.060489761 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:26:34 np0005474864 nova_compute[192593]: 2025-10-07 20:26:34.846 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:35 np0005474864 nova_compute[192593]: 2025-10-07 20:26:35.886 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:26:36 np0005474864 nova_compute[192593]: 2025-10-07 20:26:36.019 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:37 np0005474864 nova_compute[192593]: 2025-10-07 20:26:37.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:26:37 np0005474864 nova_compute[192593]: 2025-10-07 20:26:37.092 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:26:37 np0005474864 nova_compute[192593]: 2025-10-07 20:26:37.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:26:37 np0005474864 nova_compute[192593]: 2025-10-07 20:26:37.874 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "refresh_cache-9db039b2-5ec0-490b-bb78-b07d2f6341c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:26:37 np0005474864 nova_compute[192593]: 2025-10-07 20:26:37.875 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquired lock "refresh_cache-9db039b2-5ec0-490b-bb78-b07d2f6341c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:26:37 np0005474864 nova_compute[192593]: 2025-10-07 20:26:37.875 2 DEBUG nova.network.neutron [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Oct  7 16:26:37 np0005474864 nova_compute[192593]: 2025-10-07 20:26:37.876 2 DEBUG nova.objects.instance [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lazy-loading 'info_cache' on Instance uuid 9db039b2-5ec0-490b-bb78-b07d2f6341c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:26:39 np0005474864 podman[231361]: 2025-10-07 20:26:39.410058023 +0000 UTC m=+0.101998845 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Oct  7 16:26:39 np0005474864 nova_compute[192593]: 2025-10-07 20:26:39.849 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:40 np0005474864 nova_compute[192593]: 2025-10-07 20:26:40.215 2 DEBUG nova.network.neutron [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Updating instance_info_cache with network_info: [{"id": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "address": "fa:16:3e:0e:98:a1", "network": {"id": "b5fbe464-abae-4582-802d-66b1adc9bc5d", "bridge": "br-int", "label": "tempest-network-smoke--169082249", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0e:98a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72451f99-69", "ovs_interfaceid": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:26:40 np0005474864 nova_compute[192593]: 2025-10-07 20:26:40.229 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Releasing lock "refresh_cache-9db039b2-5ec0-490b-bb78-b07d2f6341c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:26:40 np0005474864 nova_compute[192593]: 2025-10-07 20:26:40.229 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Oct  7 16:26:40 np0005474864 nova_compute[192593]: 2025-10-07 20:26:40.230 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:26:40 np0005474864 nova_compute[192593]: 2025-10-07 20:26:40.231 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:26:40 np0005474864 nova_compute[192593]: 2025-10-07 20:26:40.231 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:26:40 np0005474864 nova_compute[192593]: 2025-10-07 20:26:40.232 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:26:41 np0005474864 nova_compute[192593]: 2025-10-07 20:26:41.022 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:43 np0005474864 nova_compute[192593]: 2025-10-07 20:26:43.931 2 DEBUG nova.compute.manager [req-ff4b48a8-5a24-44b4-921a-f8eb4c7ad26a req-7f75fd18-d7ef-4ab6-b25a-8f316d8464e2 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Received event network-changed-72451f99-693c-42bb-bddb-f2b4b09ac4fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:26:43 np0005474864 nova_compute[192593]: 2025-10-07 20:26:43.931 2 DEBUG nova.compute.manager [req-ff4b48a8-5a24-44b4-921a-f8eb4c7ad26a req-7f75fd18-d7ef-4ab6-b25a-8f316d8464e2 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Refreshing instance network info cache due to event network-changed-72451f99-693c-42bb-bddb-f2b4b09ac4fd. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Oct  7 16:26:43 np0005474864 nova_compute[192593]: 2025-10-07 20:26:43.932 2 DEBUG oslo_concurrency.lockutils [req-ff4b48a8-5a24-44b4-921a-f8eb4c7ad26a req-7f75fd18-d7ef-4ab6-b25a-8f316d8464e2 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "refresh_cache-9db039b2-5ec0-490b-bb78-b07d2f6341c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Oct  7 16:26:43 np0005474864 nova_compute[192593]: 2025-10-07 20:26:43.933 2 DEBUG oslo_concurrency.lockutils [req-ff4b48a8-5a24-44b4-921a-f8eb4c7ad26a req-7f75fd18-d7ef-4ab6-b25a-8f316d8464e2 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquired lock "refresh_cache-9db039b2-5ec0-490b-bb78-b07d2f6341c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Oct  7 16:26:43 np0005474864 nova_compute[192593]: 2025-10-07 20:26:43.933 2 DEBUG nova.network.neutron [req-ff4b48a8-5a24-44b4-921a-f8eb4c7ad26a req-7f75fd18-d7ef-4ab6-b25a-8f316d8464e2 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Refreshing network info cache for port 72451f99-693c-42bb-bddb-f2b4b09ac4fd _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.023 2 DEBUG oslo_concurrency.lockutils [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.024 2 DEBUG oslo_concurrency.lockutils [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.025 2 DEBUG oslo_concurrency.lockutils [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.025 2 DEBUG oslo_concurrency.lockutils [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.025 2 DEBUG oslo_concurrency.lockutils [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.027 2 INFO nova.compute.manager [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Terminating instance#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.029 2 DEBUG nova.compute.manager [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Oct  7 16:26:44 np0005474864 kernel: tap72451f99-69 (unregistering): left promiscuous mode
Oct  7 16:26:44 np0005474864 NetworkManager[51631]: <info>  [1759868804.0777] device (tap72451f99-69): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Oct  7 16:26:44 np0005474864 ovn_controller[94801]: 2025-10-07T20:26:44Z|00295|binding|INFO|Releasing lport 72451f99-693c-42bb-bddb-f2b4b09ac4fd from this chassis (sb_readonly=0)
Oct  7 16:26:44 np0005474864 ovn_controller[94801]: 2025-10-07T20:26:44Z|00296|binding|INFO|Setting lport 72451f99-693c-42bb-bddb-f2b4b09ac4fd down in Southbound
Oct  7 16:26:44 np0005474864 ovn_controller[94801]: 2025-10-07T20:26:44Z|00297|binding|INFO|Removing iface tap72451f99-69 ovn-installed in OVS
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.088 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:44.095 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:0e:98:a1 10.100.0.7 2001:db8::f816:3eff:fe0e:98a1'], port_security=['fa:16:3e:0e:98:a1 10.100.0.7 2001:db8::f816:3eff:fe0e:98a1'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28 2001:db8::f816:3eff:fe0e:98a1/64', 'neutron:device_id': '9db039b2-5ec0-490b-bb78-b07d2f6341c1', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b5fbe464-abae-4582-802d-66b1adc9bc5d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '2f9bf744045540618c9980fd4a7694f5', 'neutron:revision_number': '4', 'neutron:security_group_ids': '6b3e328b-3967-4ad5-a38e-2a20145637ec', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=e00660f2-0bb7-4ca4-9bc4-6a9508a9764e, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>], logical_port=72451f99-693c-42bb-bddb-f2b4b09ac4fd) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1427a62820>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:26:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:44.096 103685 INFO neutron.agent.ovn.metadata.agent [-] Port 72451f99-693c-42bb-bddb-f2b4b09ac4fd in datapath b5fbe464-abae-4582-802d-66b1adc9bc5d unbound from our chassis#033[00m
Oct  7 16:26:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:44.097 103685 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b5fbe464-abae-4582-802d-66b1adc9bc5d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Oct  7 16:26:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:44.098 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[4f69f8a0-0de5-4fec-88fd-bf96414c5528]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:26:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:44.098 103685 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d namespace which is not needed anymore#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.117 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:44 np0005474864 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000038.scope: Deactivated successfully.
Oct  7 16:26:44 np0005474864 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d00000038.scope: Consumed 14.955s CPU time.
Oct  7 16:26:44 np0005474864 systemd-machined[152586]: Machine qemu-20-instance-00000038 terminated.
Oct  7 16:26:44 np0005474864 neutron-haproxy-ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d[230999]: [NOTICE]   (231003) : haproxy version is 2.8.14-c23fe91
Oct  7 16:26:44 np0005474864 neutron-haproxy-ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d[230999]: [NOTICE]   (231003) : path to executable is /usr/sbin/haproxy
Oct  7 16:26:44 np0005474864 neutron-haproxy-ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d[230999]: [WARNING]  (231003) : Exiting Master process...
Oct  7 16:26:44 np0005474864 neutron-haproxy-ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d[230999]: [WARNING]  (231003) : Exiting Master process...
Oct  7 16:26:44 np0005474864 neutron-haproxy-ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d[230999]: [ALERT]    (231003) : Current worker (231005) exited with code 143 (Terminated)
Oct  7 16:26:44 np0005474864 neutron-haproxy-ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d[230999]: [WARNING]  (231003) : All workers exited. Exiting... (0)
Oct  7 16:26:44 np0005474864 systemd[1]: libpod-c4a7cb0dd099ef4db5f6c5d76676f73f5d64460adf0fd323bfc75e36953bc839.scope: Deactivated successfully.
Oct  7 16:26:44 np0005474864 podman[231406]: 2025-10-07 20:26:44.247509822 +0000 UTC m=+0.048933039 container died c4a7cb0dd099ef4db5f6c5d76676f73f5d64460adf0fd323bfc75e36953bc839 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:26:44 np0005474864 NetworkManager[51631]: <info>  [1759868804.2496] manager: (tap72451f99-69): new Tun device (/org/freedesktop/NetworkManager/Devices/155)
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.251 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.258 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:44 np0005474864 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c4a7cb0dd099ef4db5f6c5d76676f73f5d64460adf0fd323bfc75e36953bc839-userdata-shm.mount: Deactivated successfully.
Oct  7 16:26:44 np0005474864 systemd[1]: var-lib-containers-storage-overlay-bd2945e0d490cd27160323285c6b242905bba1175b5625f732a99474c8eadd85-merged.mount: Deactivated successfully.
Oct  7 16:26:44 np0005474864 podman[231406]: 2025-10-07 20:26:44.288070288 +0000 UTC m=+0.089493525 container cleanup c4a7cb0dd099ef4db5f6c5d76676f73f5d64460adf0fd323bfc75e36953bc839 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.303 2 INFO nova.virt.libvirt.driver [-] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Instance destroyed successfully.#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.304 2 DEBUG nova.objects.instance [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lazy-loading 'resources' on Instance uuid 9db039b2-5ec0-490b-bb78-b07d2f6341c1 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Oct  7 16:26:44 np0005474864 systemd[1]: libpod-conmon-c4a7cb0dd099ef4db5f6c5d76676f73f5d64460adf0fd323bfc75e36953bc839.scope: Deactivated successfully.
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.322 2 DEBUG nova.virt.libvirt.vif [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-10-07T20:25:32Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1984576638',display_name='tempest-TestGettingAddress-server-1984576638',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1984576638',id=56,image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFXMC9BHV4qYivbDJxGRtnm/v2GS7hcRhCb8qo5+Gp9JpLYRJPsnY0G6yPoLD7SXJAgR2NH3iQzGauZI0CtfWyA4PpJm+ved5s7oNOrv+n1zHgrMx00oWwi21LGuUsNrTw==',key_name='tempest-TestGettingAddress-422350208',keypairs=<?>,launch_index=0,launched_at=2025-10-07T20:25:43Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='2f9bf744045540618c9980fd4a7694f5',ramdisk_id='',reservation_id='r-25nx8bcz',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='3c70ce5f-6f9a-4def-9c79-e5a33d631679',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-626136673',owner_user_name='tempest-TestGettingAddress-626136673-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-10-07T20:25:43Z,user_data=None,user_id='334f092941fc46c496c7def76b2cfe18',uuid=9db039b2-5ec0-490b-bb78-b07d2f6341c1,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "address": "fa:16:3e:0e:98:a1", "network": {"id": "b5fbe464-abae-4582-802d-66b1adc9bc5d", "bridge": "br-int", "label": "tempest-network-smoke--169082249", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0e:98a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72451f99-69", "ovs_interfaceid": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.323 2 DEBUG nova.network.os_vif_util [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converting VIF {"id": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "address": "fa:16:3e:0e:98:a1", "network": {"id": "b5fbe464-abae-4582-802d-66b1adc9bc5d", "bridge": "br-int", "label": "tempest-network-smoke--169082249", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.237", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0e:98a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72451f99-69", "ovs_interfaceid": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.324 2 DEBUG nova.network.os_vif_util [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:0e:98:a1,bridge_name='br-int',has_traffic_filtering=True,id=72451f99-693c-42bb-bddb-f2b4b09ac4fd,network=Network(b5fbe464-abae-4582-802d-66b1adc9bc5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72451f99-69') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.324 2 DEBUG os_vif [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:0e:98:a1,bridge_name='br-int',has_traffic_filtering=True,id=72451f99-693c-42bb-bddb-f2b4b09ac4fd,network=Network(b5fbe464-abae-4582-802d-66b1adc9bc5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72451f99-69') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.326 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 21 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.326 2 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap72451f99-69, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.332 2 INFO os_vif [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:0e:98:a1,bridge_name='br-int',has_traffic_filtering=True,id=72451f99-693c-42bb-bddb-f2b4b09ac4fd,network=Network(b5fbe464-abae-4582-802d-66b1adc9bc5d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap72451f99-69')#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.333 2 INFO nova.virt.libvirt.driver [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Deleting instance files /var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1_del#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.334 2 INFO nova.virt.libvirt.driver [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Deletion of /var/lib/nova/instances/9db039b2-5ec0-490b-bb78-b07d2f6341c1_del complete#033[00m
Oct  7 16:26:44 np0005474864 podman[231450]: 2025-10-07 20:26:44.383099862 +0000 UTC m=+0.062398046 container remove c4a7cb0dd099ef4db5f6c5d76676f73f5d64460adf0fd323bfc75e36953bc839 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Oct  7 16:26:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:44.390 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[0e97402a-5187-4722-b434-d6ab03aea885]: (4, ('Tue Oct  7 08:26:44 PM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d (c4a7cb0dd099ef4db5f6c5d76676f73f5d64460adf0fd323bfc75e36953bc839)\nc4a7cb0dd099ef4db5f6c5d76676f73f5d64460adf0fd323bfc75e36953bc839\nTue Oct  7 08:26:44 PM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d (c4a7cb0dd099ef4db5f6c5d76676f73f5d64460adf0fd323bfc75e36953bc839)\nc4a7cb0dd099ef4db5f6c5d76676f73f5d64460adf0fd323bfc75e36953bc839\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:26:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:44.391 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[969c62f3-1b46-43d6-8140-d7cd0b8bf69e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:26:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:44.392 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb5fbe464-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.393 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:44 np0005474864 kernel: tapb5fbe464-a0: left promiscuous mode
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.405 2 INFO nova.compute.manager [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.405 2 DEBUG oslo.service.loopingcall [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.406 2 DEBUG nova.compute.manager [-] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.406 2 DEBUG nova.network.neutron [-] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.416 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:44.420 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[5285964a-6785-490a-9c05-0bbc00d6ef23]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:26:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:44.448 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[6775facf-86d6-47be-9998-378229f07a41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:26:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:44.450 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[2a47e6b7-65ac-4fcd-8fde-a7142fa5545a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:26:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:44.471 220243 DEBUG oslo.privsep.daemon [-] privsep: reply[d528d1d2-642a-4402-a1f3-e7008977796a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 437832, 'reachable_time': 26996, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 231469, 'error': None, 'target': 'ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:26:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:44.474 103797 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b5fbe464-abae-4582-802d-66b1adc9bc5d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Oct  7 16:26:44 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:26:44.474 103797 DEBUG oslo.privsep.daemon [-] privsep: reply[f81fcf0c-986e-4ca2-a2fc-7d636cc76e50]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Oct  7 16:26:44 np0005474864 systemd[1]: run-netns-ovnmeta\x2db5fbe464\x2dabae\x2d4582\x2d802d\x2d66b1adc9bc5d.mount: Deactivated successfully.
Oct  7 16:26:44 np0005474864 nova_compute[192593]: 2025-10-07 20:26:44.848 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:45 np0005474864 nova_compute[192593]: 2025-10-07 20:26:45.170 2 DEBUG nova.network.neutron [-] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:26:45 np0005474864 nova_compute[192593]: 2025-10-07 20:26:45.199 2 INFO nova.compute.manager [-] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Took 0.79 seconds to deallocate network for instance.#033[00m
Oct  7 16:26:45 np0005474864 nova_compute[192593]: 2025-10-07 20:26:45.242 2 DEBUG oslo_concurrency.lockutils [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:26:45 np0005474864 nova_compute[192593]: 2025-10-07 20:26:45.242 2 DEBUG oslo_concurrency.lockutils [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:26:45 np0005474864 nova_compute[192593]: 2025-10-07 20:26:45.302 2 DEBUG nova.compute.provider_tree [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:26:45 np0005474864 nova_compute[192593]: 2025-10-07 20:26:45.321 2 DEBUG nova.scheduler.client.report [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:26:45 np0005474864 nova_compute[192593]: 2025-10-07 20:26:45.344 2 DEBUG oslo_concurrency.lockutils [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:26:45 np0005474864 nova_compute[192593]: 2025-10-07 20:26:45.375 2 INFO nova.scheduler.client.report [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Deleted allocations for instance 9db039b2-5ec0-490b-bb78-b07d2f6341c1#033[00m
Oct  7 16:26:45 np0005474864 nova_compute[192593]: 2025-10-07 20:26:45.418 2 DEBUG nova.network.neutron [req-ff4b48a8-5a24-44b4-921a-f8eb4c7ad26a req-7f75fd18-d7ef-4ab6-b25a-8f316d8464e2 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Updated VIF entry in instance network info cache for port 72451f99-693c-42bb-bddb-f2b4b09ac4fd. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Oct  7 16:26:45 np0005474864 nova_compute[192593]: 2025-10-07 20:26:45.419 2 DEBUG nova.network.neutron [req-ff4b48a8-5a24-44b4-921a-f8eb4c7ad26a req-7f75fd18-d7ef-4ab6-b25a-8f316d8464e2 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Updating instance_info_cache with network_info: [{"id": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "address": "fa:16:3e:0e:98:a1", "network": {"id": "b5fbe464-abae-4582-802d-66b1adc9bc5d", "bridge": "br-int", "label": "tempest-network-smoke--169082249", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe0e:98a1", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "2f9bf744045540618c9980fd4a7694f5", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap72451f99-69", "ovs_interfaceid": "72451f99-693c-42bb-bddb-f2b4b09ac4fd", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Oct  7 16:26:45 np0005474864 nova_compute[192593]: 2025-10-07 20:26:45.436 2 DEBUG oslo_concurrency.lockutils [None req-1ff68218-f73d-4c70-96b9-e64b63cace4f 334f092941fc46c496c7def76b2cfe18 2f9bf744045540618c9980fd4a7694f5 - - default default] Lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.411s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:26:45 np0005474864 nova_compute[192593]: 2025-10-07 20:26:45.438 2 DEBUG oslo_concurrency.lockutils [req-ff4b48a8-5a24-44b4-921a-f8eb4c7ad26a req-7f75fd18-d7ef-4ab6-b25a-8f316d8464e2 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Releasing lock "refresh_cache-9db039b2-5ec0-490b-bb78-b07d2f6341c1" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Oct  7 16:26:46 np0005474864 nova_compute[192593]: 2025-10-07 20:26:46.300 2 DEBUG nova.compute.manager [req-ffcd16dd-e774-4550-991f-2c452d2d1a78 req-5cc3acd4-2064-4fa2-a7ec-3ed4d79057ef 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Received event network-vif-unplugged-72451f99-693c-42bb-bddb-f2b4b09ac4fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:26:46 np0005474864 nova_compute[192593]: 2025-10-07 20:26:46.301 2 DEBUG oslo_concurrency.lockutils [req-ffcd16dd-e774-4550-991f-2c452d2d1a78 req-5cc3acd4-2064-4fa2-a7ec-3ed4d79057ef 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:26:46 np0005474864 nova_compute[192593]: 2025-10-07 20:26:46.301 2 DEBUG oslo_concurrency.lockutils [req-ffcd16dd-e774-4550-991f-2c452d2d1a78 req-5cc3acd4-2064-4fa2-a7ec-3ed4d79057ef 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:26:46 np0005474864 nova_compute[192593]: 2025-10-07 20:26:46.302 2 DEBUG oslo_concurrency.lockutils [req-ffcd16dd-e774-4550-991f-2c452d2d1a78 req-5cc3acd4-2064-4fa2-a7ec-3ed4d79057ef 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:26:46 np0005474864 nova_compute[192593]: 2025-10-07 20:26:46.302 2 DEBUG nova.compute.manager [req-ffcd16dd-e774-4550-991f-2c452d2d1a78 req-5cc3acd4-2064-4fa2-a7ec-3ed4d79057ef 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] No waiting events found dispatching network-vif-unplugged-72451f99-693c-42bb-bddb-f2b4b09ac4fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:26:46 np0005474864 nova_compute[192593]: 2025-10-07 20:26:46.303 2 WARNING nova.compute.manager [req-ffcd16dd-e774-4550-991f-2c452d2d1a78 req-5cc3acd4-2064-4fa2-a7ec-3ed4d79057ef 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Received unexpected event network-vif-unplugged-72451f99-693c-42bb-bddb-f2b4b09ac4fd for instance with vm_state deleted and task_state None.#033[00m
Oct  7 16:26:46 np0005474864 nova_compute[192593]: 2025-10-07 20:26:46.303 2 DEBUG nova.compute.manager [req-ffcd16dd-e774-4550-991f-2c452d2d1a78 req-5cc3acd4-2064-4fa2-a7ec-3ed4d79057ef 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Received event network-vif-plugged-72451f99-693c-42bb-bddb-f2b4b09ac4fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:26:46 np0005474864 nova_compute[192593]: 2025-10-07 20:26:46.303 2 DEBUG oslo_concurrency.lockutils [req-ffcd16dd-e774-4550-991f-2c452d2d1a78 req-5cc3acd4-2064-4fa2-a7ec-3ed4d79057ef 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Acquiring lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:26:46 np0005474864 nova_compute[192593]: 2025-10-07 20:26:46.304 2 DEBUG oslo_concurrency.lockutils [req-ffcd16dd-e774-4550-991f-2c452d2d1a78 req-5cc3acd4-2064-4fa2-a7ec-3ed4d79057ef 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:26:46 np0005474864 nova_compute[192593]: 2025-10-07 20:26:46.304 2 DEBUG oslo_concurrency.lockutils [req-ffcd16dd-e774-4550-991f-2c452d2d1a78 req-5cc3acd4-2064-4fa2-a7ec-3ed4d79057ef 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] Lock "9db039b2-5ec0-490b-bb78-b07d2f6341c1-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:26:46 np0005474864 nova_compute[192593]: 2025-10-07 20:26:46.305 2 DEBUG nova.compute.manager [req-ffcd16dd-e774-4550-991f-2c452d2d1a78 req-5cc3acd4-2064-4fa2-a7ec-3ed4d79057ef 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] No waiting events found dispatching network-vif-plugged-72451f99-693c-42bb-bddb-f2b4b09ac4fd pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Oct  7 16:26:46 np0005474864 nova_compute[192593]: 2025-10-07 20:26:46.305 2 WARNING nova.compute.manager [req-ffcd16dd-e774-4550-991f-2c452d2d1a78 req-5cc3acd4-2064-4fa2-a7ec-3ed4d79057ef 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Received unexpected event network-vif-plugged-72451f99-693c-42bb-bddb-f2b4b09ac4fd for instance with vm_state deleted and task_state None.#033[00m
Oct  7 16:26:46 np0005474864 nova_compute[192593]: 2025-10-07 20:26:46.306 2 DEBUG nova.compute.manager [req-ffcd16dd-e774-4550-991f-2c452d2d1a78 req-5cc3acd4-2064-4fa2-a7ec-3ed4d79057ef 2830f402a62042f29bf1a401d2b71c8d 1c48104a1c5f49eeacf02a66132a67d4 - - default default] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Received event network-vif-deleted-72451f99-693c-42bb-bddb-f2b4b09ac4fd external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Oct  7 16:26:49 np0005474864 nova_compute[192593]: 2025-10-07 20:26:49.328 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:49 np0005474864 nova_compute[192593]: 2025-10-07 20:26:49.851 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:52 np0005474864 podman[231471]: 2025-10-07 20:26:52.403546629 +0000 UTC m=+0.082039221 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, architecture=x86_64, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, managed_by=edpm_ansible, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': 
['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-type=git)
Oct  7 16:26:52 np0005474864 podman[231470]: 2025-10-07 20:26:52.434566162 +0000 UTC m=+0.116427250 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  7 16:26:54 np0005474864 nova_compute[192593]: 2025-10-07 20:26:54.331 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:54 np0005474864 nova_compute[192593]: 2025-10-07 20:26:54.871 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:54 np0005474864 nova_compute[192593]: 2025-10-07 20:26:54.931 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:54 np0005474864 nova_compute[192593]: 2025-10-07 20:26:54.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:56 np0005474864 podman[231515]: 2025-10-07 20:26:56.377403167 +0000 UTC m=+0.070999994 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  7 16:26:56 np0005474864 podman[231517]: 2025-10-07 20:26:56.387155927 +0000 UTC m=+0.068522452 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:26:56 np0005474864 podman[231516]: 2025-10-07 20:26:56.414214066 +0000 UTC m=+0.101429429 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  7 16:26:59 np0005474864 nova_compute[192593]: 2025-10-07 20:26:59.301 2 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1759868804.3007379, 9db039b2-5ec0-490b-bb78-b07d2f6341c1 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Oct  7 16:26:59 np0005474864 nova_compute[192593]: 2025-10-07 20:26:59.302 2 INFO nova.compute.manager [-] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] VM Stopped (Lifecycle Event)#033[00m
Oct  7 16:26:59 np0005474864 nova_compute[192593]: 2025-10-07 20:26:59.324 2 DEBUG nova.compute.manager [None req-b240f941-5ced-480d-a25e-4fad7aac6f50 - - - - - -] [instance: 9db039b2-5ec0-490b-bb78-b07d2f6341c1] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Oct  7 16:26:59 np0005474864 nova_compute[192593]: 2025-10-07 20:26:59.332 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:26:59 np0005474864 nova_compute[192593]: 2025-10-07 20:26:59.935 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:01 np0005474864 podman[231579]: 2025-10-07 20:27:01.370935995 +0000 UTC m=+0.061119749 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Oct  7 16:27:04 np0005474864 nova_compute[192593]: 2025-10-07 20:27:04.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:04 np0005474864 nova_compute[192593]: 2025-10-07 20:27:04.938 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:05 np0005474864 podman[231598]: 2025-10-07 20:27:05.382443726 +0000 UTC m=+0.073178696 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:27:09 np0005474864 nova_compute[192593]: 2025-10-07 20:27:09.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:09 np0005474864 nova_compute[192593]: 2025-10-07 20:27:09.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:10 np0005474864 podman[231623]: 2025-10-07 20:27:10.422572094 +0000 UTC m=+0.091189104 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute)
Oct  7 16:27:14 np0005474864 nova_compute[192593]: 2025-10-07 20:27:14.340 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:14 np0005474864 nova_compute[192593]: 2025-10-07 20:27:14.947 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:27:16.203 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:27:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:27:16.203 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:27:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:27:16.204 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:27:19 np0005474864 nova_compute[192593]: 2025-10-07 20:27:19.342 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:19 np0005474864 nova_compute[192593]: 2025-10-07 20:27:19.950 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:23 np0005474864 podman[231645]: 2025-10-07 20:27:23.365447013 +0000 UTC m=+0.060076679 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public)
Oct  7 16:27:23 np0005474864 podman[231644]: 2025-10-07 20:27:23.37964032 +0000 UTC m=+0.076868251 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  7 16:27:24 np0005474864 nova_compute[192593]: 2025-10-07 20:27:24.345 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:24 np0005474864 nova_compute[192593]: 2025-10-07 20:27:24.954 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:27 np0005474864 ovn_controller[94801]: 2025-10-07T20:27:27Z|00298|memory_trim|INFO|Detected inactivity (last active 30022 ms ago): trimming memory
Oct  7 16:27:27 np0005474864 podman[231684]: 2025-10-07 20:27:27.399376677 +0000 UTC m=+0.085477580 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_id=iscsid)
Oct  7 16:27:27 np0005474864 podman[231686]: 2025-10-07 20:27:27.452708091 +0000 UTC m=+0.119639052 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_id=multipathd, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  7 16:27:27 np0005474864 podman[231685]: 2025-10-07 20:27:27.457485569 +0000 UTC m=+0.137902508 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true)
Oct  7 16:27:29 np0005474864 nova_compute[192593]: 2025-10-07 20:27:29.346 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:29 np0005474864 nova_compute[192593]: 2025-10-07 20:27:29.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:32 np0005474864 nova_compute[192593]: 2025-10-07 20:27:32.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:27:32 np0005474864 nova_compute[192593]: 2025-10-07 20:27:32.094 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:27:32 np0005474864 nova_compute[192593]: 2025-10-07 20:27:32.094 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:27:32 np0005474864 nova_compute[192593]: 2025-10-07 20:27:32.095 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:27:32 np0005474864 nova_compute[192593]: 2025-10-07 20:27:32.129 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:27:32 np0005474864 nova_compute[192593]: 2025-10-07 20:27:32.130 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:27:32 np0005474864 nova_compute[192593]: 2025-10-07 20:27:32.130 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:27:32 np0005474864 nova_compute[192593]: 2025-10-07 20:27:32.131 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:27:32 np0005474864 podman[231749]: 2025-10-07 20:27:32.317418734 +0000 UTC m=+0.121966340 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=ovn_metadata_agent, io.buildah.version=1.41.3)
Oct  7 16:27:32 np0005474864 nova_compute[192593]: 2025-10-07 20:27:32.402 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:27:32 np0005474864 nova_compute[192593]: 2025-10-07 20:27:32.403 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5759MB free_disk=73.45463943481445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:27:32 np0005474864 nova_compute[192593]: 2025-10-07 20:27:32.403 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:27:32 np0005474864 nova_compute[192593]: 2025-10-07 20:27:32.403 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:27:32 np0005474864 nova_compute[192593]: 2025-10-07 20:27:32.479 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:27:32 np0005474864 nova_compute[192593]: 2025-10-07 20:27:32.480 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:27:32 np0005474864 nova_compute[192593]: 2025-10-07 20:27:32.508 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:27:32 np0005474864 nova_compute[192593]: 2025-10-07 20:27:32.524 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:27:32 np0005474864 nova_compute[192593]: 2025-10-07 20:27:32.557 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:27:32 np0005474864 nova_compute[192593]: 2025-10-07 20:27:32.557 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.154s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:27:34 np0005474864 nova_compute[192593]: 2025-10-07 20:27:34.349 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:34 np0005474864 nova_compute[192593]: 2025-10-07 20:27:34.961 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:36 np0005474864 podman[231769]: 2025-10-07 20:27:36.378085129 +0000 UTC m=+0.070805548 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:27:37 np0005474864 nova_compute[192593]: 2025-10-07 20:27:37.556 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:27:39 np0005474864 nova_compute[192593]: 2025-10-07 20:27:39.088 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:27:39 np0005474864 nova_compute[192593]: 2025-10-07 20:27:39.113 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:27:39 np0005474864 nova_compute[192593]: 2025-10-07 20:27:39.114 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:27:39 np0005474864 nova_compute[192593]: 2025-10-07 20:27:39.114 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:27:39 np0005474864 nova_compute[192593]: 2025-10-07 20:27:39.137 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:27:39 np0005474864 nova_compute[192593]: 2025-10-07 20:27:39.138 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:27:39 np0005474864 nova_compute[192593]: 2025-10-07 20:27:39.138 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:27:39 np0005474864 nova_compute[192593]: 2025-10-07 20:27:39.351 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:39 np0005474864 nova_compute[192593]: 2025-10-07 20:27:39.963 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:40 np0005474864 nova_compute[192593]: 2025-10-07 20:27:40.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:27:40 np0005474864 systemd-logind[805]: New session 30 of user zuul.
Oct  7 16:27:40 np0005474864 systemd[1]: Started Session 30 of User zuul.
Oct  7 16:27:40 np0005474864 podman[231797]: 2025-10-07 20:27:40.968151371 +0000 UTC m=+0.073261318 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:27:42 np0005474864 nova_compute[192593]: 2025-10-07 20:27:42.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:27:44 np0005474864 nova_compute[192593]: 2025-10-07 20:27:44.355 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:44 np0005474864 nova_compute[192593]: 2025-10-07 20:27:44.966 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:45 np0005474864 ovs-vsctl[231992]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct  7 16:27:46 np0005474864 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 231844 (sos)
Oct  7 16:27:46 np0005474864 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Oct  7 16:27:46 np0005474864 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Oct  7 16:27:46 np0005474864 virtqemud[192092]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct  7 16:27:46 np0005474864 virtqemud[192092]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct  7 16:27:46 np0005474864 virtqemud[192092]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct  7 16:27:47 np0005474864 kernel: block vda: the capability attribute has been deprecated.
Oct  7 16:27:49 np0005474864 nova_compute[192593]: 2025-10-07 20:27:49.356 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:49 np0005474864 nova_compute[192593]: 2025-10-07 20:27:49.968 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:50 np0005474864 systemd[1]: Starting Hostname Service...
Oct  7 16:27:50 np0005474864 systemd[1]: Started Hostname Service.
Oct  7 16:27:54 np0005474864 nova_compute[192593]: 2025-10-07 20:27:54.358 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:54 np0005474864 podman[232872]: 2025-10-07 20:27:54.370883048 +0000 UTC m=+0.063225330 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vcs-type=git, release=1755695350, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9)
Oct  7 16:27:54 np0005474864 podman[232870]: 2025-10-07 20:27:54.400279633 +0000 UTC m=+0.092568023 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:27:54 np0005474864 nova_compute[192593]: 2025-10-07 20:27:54.970 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:56 np0005474864 ovs-appctl[233641]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct  7 16:27:56 np0005474864 ovs-appctl[233650]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct  7 16:27:56 np0005474864 ovs-appctl[233663]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Oct  7 16:27:58 np0005474864 podman[234312]: 2025-10-07 20:27:58.401509428 +0000 UTC m=+0.093561562 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Oct  7 16:27:58 np0005474864 podman[234317]: 2025-10-07 20:27:58.40364867 +0000 UTC m=+0.094672135 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  7 16:27:58 np0005474864 podman[234315]: 2025-10-07 20:27:58.414419119 +0000 UTC m=+0.106536665 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller)
Oct  7 16:27:59 np0005474864 nova_compute[192593]: 2025-10-07 20:27:59.360 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:27:59 np0005474864 nova_compute[192593]: 2025-10-07 20:27:59.971 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:02 np0005474864 podman[234776]: 2025-10-07 20:28:02.497164202 +0000 UTC m=+0.085360995 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  7 16:28:04 np0005474864 virtqemud[192092]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct  7 16:28:04 np0005474864 nova_compute[192593]: 2025-10-07 20:28:04.362 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:04 np0005474864 nova_compute[192593]: 2025-10-07 20:28:04.973 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:06 np0005474864 systemd[1]: Starting Time & Date Service...
Oct  7 16:28:06 np0005474864 systemd[1]: Started Time & Date Service.
Oct  7 16:28:06 np0005474864 podman[235264]: 2025-10-07 20:28:06.490954993 +0000 UTC m=+0.067204139 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:28:09 np0005474864 nova_compute[192593]: 2025-10-07 20:28:09.392 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:09 np0005474864 nova_compute[192593]: 2025-10-07 20:28:09.974 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:11 np0005474864 podman[235288]: 2025-10-07 20:28:11.366525574 +0000 UTC m=+0.066258381 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  7 16:28:14 np0005474864 nova_compute[192593]: 2025-10-07 20:28:14.394 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:14 np0005474864 nova_compute[192593]: 2025-10-07 20:28:14.978 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:28:16.205 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:28:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:28:16.205 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:28:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:28:16.205 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:28:19 np0005474864 nova_compute[192593]: 2025-10-07 20:28:19.397 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:19 np0005474864 nova_compute[192593]: 2025-10-07 20:28:19.980 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:24 np0005474864 nova_compute[192593]: 2025-10-07 20:28:24.400 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:24 np0005474864 nova_compute[192593]: 2025-10-07 20:28:24.983 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:25 np0005474864 podman[235326]: 2025-10-07 20:28:25.366768488 +0000 UTC m=+0.054865662 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.openshift.tags=minimal rhel9, release=1755695350, name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., architecture=x86_64, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, version=9.6, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Oct  7 16:28:25 np0005474864 podman[235325]: 2025-10-07 20:28:25.393361139 +0000 UTC m=+0.091757461 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Oct  7 16:28:26 np0005474864 systemd[1]: session-30.scope: Deactivated successfully.
Oct  7 16:28:26 np0005474864 systemd[1]: session-30.scope: Consumed 1min 15.299s CPU time, 526.0M memory peak, read 107.4M from disk, written 21.4M to disk.
Oct  7 16:28:26 np0005474864 systemd-logind[805]: Session 30 logged out. Waiting for processes to exit.
Oct  7 16:28:26 np0005474864 systemd-logind[805]: Removed session 30.
Oct  7 16:28:26 np0005474864 systemd-logind[805]: New session 31 of user zuul.
Oct  7 16:28:26 np0005474864 systemd[1]: Started Session 31 of User zuul.
Oct  7 16:28:27 np0005474864 systemd[1]: session-31.scope: Deactivated successfully.
Oct  7 16:28:27 np0005474864 systemd-logind[805]: Session 31 logged out. Waiting for processes to exit.
Oct  7 16:28:27 np0005474864 systemd-logind[805]: Removed session 31.
Oct  7 16:28:27 np0005474864 systemd-logind[805]: New session 32 of user zuul.
Oct  7 16:28:27 np0005474864 systemd[1]: Started Session 32 of User zuul.
Oct  7 16:28:28 np0005474864 systemd[1]: session-32.scope: Deactivated successfully.
Oct  7 16:28:28 np0005474864 systemd-logind[805]: Session 32 logged out. Waiting for processes to exit.
Oct  7 16:28:28 np0005474864 systemd-logind[805]: Removed session 32.
Oct  7 16:28:29 np0005474864 podman[235431]: 2025-10-07 20:28:29.367144881 +0000 UTC m=+0.059806594 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, managed_by=edpm_ansible)
Oct  7 16:28:29 np0005474864 podman[235429]: 2025-10-07 20:28:29.394118583 +0000 UTC m=+0.086730505 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Oct  7 16:28:29 np0005474864 nova_compute[192593]: 2025-10-07 20:28:29.401 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:29 np0005474864 podman[235430]: 2025-10-07 20:28:29.419107497 +0000 UTC m=+0.111719409 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Oct  7 16:28:29 np0005474864 nova_compute[192593]: 2025-10-07 20:28:29.985 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.258 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:28:31.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:28:32 np0005474864 nova_compute[192593]: 2025-10-07 20:28:32.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:28:32 np0005474864 nova_compute[192593]: 2025-10-07 20:28:32.128 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:28:32 np0005474864 nova_compute[192593]: 2025-10-07 20:28:32.129 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:28:32 np0005474864 nova_compute[192593]: 2025-10-07 20:28:32.129 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:28:32 np0005474864 nova_compute[192593]: 2025-10-07 20:28:32.129 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:28:32 np0005474864 nova_compute[192593]: 2025-10-07 20:28:32.312 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:28:32 np0005474864 nova_compute[192593]: 2025-10-07 20:28:32.313 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5627MB free_disk=73.45404434204102GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:28:32 np0005474864 nova_compute[192593]: 2025-10-07 20:28:32.313 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:28:32 np0005474864 nova_compute[192593]: 2025-10-07 20:28:32.314 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:28:32 np0005474864 nova_compute[192593]: 2025-10-07 20:28:32.660 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:28:32 np0005474864 nova_compute[192593]: 2025-10-07 20:28:32.661 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:28:32 np0005474864 nova_compute[192593]: 2025-10-07 20:28:32.697 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:28:32 np0005474864 nova_compute[192593]: 2025-10-07 20:28:32.737 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:28:32 np0005474864 nova_compute[192593]: 2025-10-07 20:28:32.739 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:28:32 np0005474864 nova_compute[192593]: 2025-10-07 20:28:32.739 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.426s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:28:33 np0005474864 podman[235491]: 2025-10-07 20:28:33.415757262 +0000 UTC m=+0.096008464 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:28:33 np0005474864 nova_compute[192593]: 2025-10-07 20:28:33.735 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:28:33 np0005474864 nova_compute[192593]: 2025-10-07 20:28:33.736 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:28:34 np0005474864 nova_compute[192593]: 2025-10-07 20:28:34.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:28:34 np0005474864 nova_compute[192593]: 2025-10-07 20:28:34.403 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:34 np0005474864 nova_compute[192593]: 2025-10-07 20:28:34.988 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:36 np0005474864 systemd[1]: systemd-timedated.service: Deactivated successfully.
Oct  7 16:28:36 np0005474864 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Oct  7 16:28:37 np0005474864 podman[235514]: 2025-10-07 20:28:37.382960713 +0000 UTC m=+0.069940488 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:28:39 np0005474864 nova_compute[192593]: 2025-10-07 20:28:39.091 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:28:39 np0005474864 nova_compute[192593]: 2025-10-07 20:28:39.405 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:39 np0005474864 nova_compute[192593]: 2025-10-07 20:28:39.993 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:40 np0005474864 nova_compute[192593]: 2025-10-07 20:28:40.091 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:28:41 np0005474864 nova_compute[192593]: 2025-10-07 20:28:41.091 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:28:41 np0005474864 nova_compute[192593]: 2025-10-07 20:28:41.092 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:28:41 np0005474864 nova_compute[192593]: 2025-10-07 20:28:41.092 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:28:41 np0005474864 nova_compute[192593]: 2025-10-07 20:28:41.115 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:28:41 np0005474864 nova_compute[192593]: 2025-10-07 20:28:41.116 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:28:41 np0005474864 nova_compute[192593]: 2025-10-07 20:28:41.116 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:28:42 np0005474864 podman[235538]: 2025-10-07 20:28:42.397499431 +0000 UTC m=+0.091896165 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:28:43 np0005474864 nova_compute[192593]: 2025-10-07 20:28:43.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:28:44 np0005474864 nova_compute[192593]: 2025-10-07 20:28:44.408 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:44 np0005474864 nova_compute[192593]: 2025-10-07 20:28:44.994 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:45 np0005474864 nova_compute[192593]: 2025-10-07 20:28:45.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:28:45 np0005474864 nova_compute[192593]: 2025-10-07 20:28:45.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  7 16:28:49 np0005474864 nova_compute[192593]: 2025-10-07 20:28:49.410 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:49 np0005474864 nova_compute[192593]: 2025-10-07 20:28:49.995 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:52 np0005474864 nova_compute[192593]: 2025-10-07 20:28:52.110 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:28:52 np0005474864 nova_compute[192593]: 2025-10-07 20:28:52.110 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  7 16:28:52 np0005474864 nova_compute[192593]: 2025-10-07 20:28:52.128 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  7 16:28:54 np0005474864 nova_compute[192593]: 2025-10-07 20:28:54.412 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:55 np0005474864 nova_compute[192593]: 2025-10-07 20:28:54.999 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:28:55 np0005474864 nova_compute[192593]: 2025-10-07 20:28:55.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:28:56 np0005474864 podman[235558]: 2025-10-07 20:28:56.401787302 +0000 UTC m=+0.088907928 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:28:56 np0005474864 podman[235559]: 2025-10-07 20:28:56.411929726 +0000 UTC m=+0.098925689 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, build-date=2025-08-20T13:12:41, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, vendor=Red Hat, Inc., io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, distribution-scope=public)
Oct  7 16:28:59 np0005474864 nova_compute[192593]: 2025-10-07 20:28:59.414 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:00 np0005474864 nova_compute[192593]: 2025-10-07 20:29:00.001 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:00 np0005474864 podman[235603]: 2025-10-07 20:29:00.398496838 +0000 UTC m=+0.083732218 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']})
Oct  7 16:29:00 np0005474864 podman[235605]: 2025-10-07 20:29:00.422623267 +0000 UTC m=+0.097012333 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_managed=true)
Oct  7 16:29:00 np0005474864 podman[235604]: 2025-10-07 20:29:00.422889955 +0000 UTC m=+0.111567355 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001)
Oct  7 16:29:04 np0005474864 podman[235672]: 2025-10-07 20:29:04.365211007 +0000 UTC m=+0.062047209 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:29:04 np0005474864 nova_compute[192593]: 2025-10-07 20:29:04.417 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:05 np0005474864 nova_compute[192593]: 2025-10-07 20:29:05.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:08 np0005474864 podman[235691]: 2025-10-07 20:29:08.387592336 +0000 UTC m=+0.067185288 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  7 16:29:09 np0005474864 nova_compute[192593]: 2025-10-07 20:29:09.419 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:10 np0005474864 nova_compute[192593]: 2025-10-07 20:29:10.006 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:13 np0005474864 podman[235715]: 2025-10-07 20:29:13.392707892 +0000 UTC m=+0.080753472 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:29:14 np0005474864 nova_compute[192593]: 2025-10-07 20:29:14.422 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:15 np0005474864 nova_compute[192593]: 2025-10-07 20:29:15.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:29:16.206 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:29:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:29:16.208 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:29:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:29:16.208 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:29:19 np0005474864 nova_compute[192593]: 2025-10-07 20:29:19.424 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:20 np0005474864 nova_compute[192593]: 2025-10-07 20:29:20.010 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:24 np0005474864 nova_compute[192593]: 2025-10-07 20:29:24.427 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:25 np0005474864 nova_compute[192593]: 2025-10-07 20:29:25.013 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:27 np0005474864 podman[235736]: 2025-10-07 20:29:27.364043787 +0000 UTC m=+0.056743826 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, vendor=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  7 16:29:27 np0005474864 podman[235735]: 2025-10-07 20:29:27.380339609 +0000 UTC m=+0.067784456 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:29:29 np0005474864 nova_compute[192593]: 2025-10-07 20:29:29.429 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:30 np0005474864 nova_compute[192593]: 2025-10-07 20:29:30.015 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:31 np0005474864 podman[235780]: 2025-10-07 20:29:31.415928383 +0000 UTC m=+0.101102051 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=iscsid, container_name=iscsid, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:29:31 np0005474864 podman[235782]: 2025-10-07 20:29:31.432237236 +0000 UTC m=+0.102265495 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Oct  7 16:29:31 np0005474864 podman[235781]: 2025-10-07 20:29:31.478667192 +0000 UTC m=+0.151142852 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, tcib_managed=true, container_name=ovn_controller)
Oct  7 16:29:33 np0005474864 nova_compute[192593]: 2025-10-07 20:29:33.116 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:29:34 np0005474864 nova_compute[192593]: 2025-10-07 20:29:34.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:29:34 np0005474864 nova_compute[192593]: 2025-10-07 20:29:34.136 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:29:34 np0005474864 nova_compute[192593]: 2025-10-07 20:29:34.137 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:29:34 np0005474864 nova_compute[192593]: 2025-10-07 20:29:34.138 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:29:34 np0005474864 nova_compute[192593]: 2025-10-07 20:29:34.139 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:29:34 np0005474864 nova_compute[192593]: 2025-10-07 20:29:34.369 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:29:34 np0005474864 nova_compute[192593]: 2025-10-07 20:29:34.371 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5682MB free_disk=73.45463943481445GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": 
"label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:29:34 np0005474864 nova_compute[192593]: 2025-10-07 20:29:34.371 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:29:34 np0005474864 nova_compute[192593]: 2025-10-07 20:29:34.372 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:29:34 np0005474864 nova_compute[192593]: 2025-10-07 20:29:34.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:34 np0005474864 nova_compute[192593]: 2025-10-07 20:29:34.530 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:29:34 np0005474864 nova_compute[192593]: 2025-10-07 20:29:34.530 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:29:34 np0005474864 nova_compute[192593]: 2025-10-07 20:29:34.560 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:29:34 np0005474864 nova_compute[192593]: 2025-10-07 20:29:34.577 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:29:34 np0005474864 nova_compute[192593]: 2025-10-07 20:29:34.579 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:29:34 np0005474864 nova_compute[192593]: 2025-10-07 20:29:34.580 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:29:35 np0005474864 nova_compute[192593]: 2025-10-07 20:29:35.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:35 np0005474864 podman[235844]: 2025-10-07 20:29:35.404869345 +0000 UTC m=+0.085259563 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0)
Oct  7 16:29:35 np0005474864 nova_compute[192593]: 2025-10-07 20:29:35.576 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:29:36 np0005474864 nova_compute[192593]: 2025-10-07 20:29:36.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:29:39 np0005474864 podman[235864]: 2025-10-07 20:29:39.362459497 +0000 UTC m=+0.052904034 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:29:39 np0005474864 nova_compute[192593]: 2025-10-07 20:29:39.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:40 np0005474864 nova_compute[192593]: 2025-10-07 20:29:40.077 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:40 np0005474864 nova_compute[192593]: 2025-10-07 20:29:40.091 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:29:40 np0005474864 nova_compute[192593]: 2025-10-07 20:29:40.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:29:41 np0005474864 nova_compute[192593]: 2025-10-07 20:29:41.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:29:41 np0005474864 nova_compute[192593]: 2025-10-07 20:29:41.092 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:29:41 np0005474864 nova_compute[192593]: 2025-10-07 20:29:41.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:29:41 np0005474864 nova_compute[192593]: 2025-10-07 20:29:41.115 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:29:41 np0005474864 nova_compute[192593]: 2025-10-07 20:29:41.115 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:29:41 np0005474864 nova_compute[192593]: 2025-10-07 20:29:41.115 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:29:42 np0005474864 nova_compute[192593]: 2025-10-07 20:29:42.111 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:29:44 np0005474864 podman[235888]: 2025-10-07 20:29:44.38269799 +0000 UTC m=+0.071414731 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, 
managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Oct  7 16:29:44 np0005474864 nova_compute[192593]: 2025-10-07 20:29:44.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:45 np0005474864 nova_compute[192593]: 2025-10-07 20:29:45.080 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:45 np0005474864 nova_compute[192593]: 2025-10-07 20:29:45.091 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:29:49 np0005474864 nova_compute[192593]: 2025-10-07 20:29:49.437 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:50 np0005474864 nova_compute[192593]: 2025-10-07 20:29:50.083 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:54 np0005474864 nova_compute[192593]: 2025-10-07 20:29:54.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:55 np0005474864 nova_compute[192593]: 2025-10-07 20:29:55.113 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:29:58 np0005474864 podman[235908]: 2025-10-07 20:29:58.381007526 +0000 UTC m=+0.073156961 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:29:58 np0005474864 podman[235909]: 2025-10-07 20:29:58.403964532 +0000 UTC m=+0.091360959 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, vendor=Red Hat, Inc., io.buildah.version=1.33.7, architecture=x86_64, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_id=edpm, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, managed_by=edpm_ansible)
Oct  7 16:29:59 np0005474864 nova_compute[192593]: 2025-10-07 20:29:59.440 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:00 np0005474864 nova_compute[192593]: 2025-10-07 20:30:00.152 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:02 np0005474864 podman[235954]: 2025-10-07 20:30:02.366298212 +0000 UTC m=+0.064988674 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=iscsid, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  7 16:30:02 np0005474864 podman[235956]: 2025-10-07 20:30:02.394756847 +0000 UTC m=+0.083842941 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251001, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  7 16:30:02 np0005474864 podman[235955]: 2025-10-07 20:30:02.397226298 +0000 UTC m=+0.095401645 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Oct  7 16:30:04 np0005474864 nova_compute[192593]: 2025-10-07 20:30:04.443 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:05 np0005474864 nova_compute[192593]: 2025-10-07 20:30:05.153 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:06 np0005474864 podman[236016]: 2025-10-07 20:30:06.383925065 +0000 UTC m=+0.074510231 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  7 16:30:09 np0005474864 nova_compute[192593]: 2025-10-07 20:30:09.445 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:10 np0005474864 nova_compute[192593]: 2025-10-07 20:30:10.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:10 np0005474864 podman[236036]: 2025-10-07 20:30:10.391531338 +0000 UTC m=+0.084242713 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:30:14 np0005474864 nova_compute[192593]: 2025-10-07 20:30:14.447 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:15 np0005474864 nova_compute[192593]: 2025-10-07 20:30:15.157 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:15 np0005474864 podman[236059]: 2025-10-07 20:30:15.399710481 +0000 UTC m=+0.093275194 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, io.buildah.version=1.41.3)
Oct  7 16:30:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:30:16.208 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:30:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:30:16.208 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:30:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:30:16.208 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:30:19 np0005474864 nova_compute[192593]: 2025-10-07 20:30:19.455 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:20 np0005474864 nova_compute[192593]: 2025-10-07 20:30:20.198 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:24 np0005474864 nova_compute[192593]: 2025-10-07 20:30:24.457 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:25 np0005474864 nova_compute[192593]: 2025-10-07 20:30:25.201 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:29 np0005474864 podman[236080]: 2025-10-07 20:30:29.392649631 +0000 UTC m=+0.075551940 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  7 16:30:29 np0005474864 podman[236081]: 2025-10-07 20:30:29.394203347 +0000 UTC m=+0.076723905 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vcs-type=git, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_id=edpm, distribution-scope=public, com.redhat.component=ubi9-minimal-container)
Oct  7 16:30:29 np0005474864 nova_compute[192593]: 2025-10-07 20:30:29.514 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:30 np0005474864 nova_compute[192593]: 2025-10-07 20:30:30.205 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.259 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.260 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.262 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.262 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.262 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.262 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.262 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.262 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.262 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.263 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.263 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.263 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.263 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.263 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:30:31.263 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:30:33 np0005474864 podman[236124]: 2025-10-07 20:30:33.402805577 +0000 UTC m=+0.087471746 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, container_name=iscsid, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  7 16:30:33 np0005474864 podman[236126]: 2025-10-07 20:30:33.419678956 +0000 UTC m=+0.087778355 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:30:33 np0005474864 podman[236125]: 2025-10-07 20:30:33.465286118 +0000 UTC m=+0.141363438 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  7 16:30:34 np0005474864 nova_compute[192593]: 2025-10-07 20:30:34.517 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:35 np0005474864 nova_compute[192593]: 2025-10-07 20:30:35.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:30:35 np0005474864 nova_compute[192593]: 2025-10-07 20:30:35.207 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.088 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.120 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.120 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.121 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.121 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.387 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.390 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5727MB free_disk=73.4545783996582GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.390 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.391 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.557 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.558 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.585 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Refreshing inventories for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.606 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Updating ProviderTree inventory for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.606 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Updating inventory in ProviderTree for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.624 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Refreshing aggregate associations for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.663 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Refreshing trait associations for resource provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4, traits: COMPUTE_NET_VIF_MODEL_PCNET,COMPUTE_STORAGE_BUS_FDC,COMPUTE_RESCUE_BFV,HW_CPU_X86_SSE,COMPUTE_NODE,COMPUTE_GRAPHICS_MODEL_BOCHS,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_TRUSTED_CERTS,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_DEVICE_TAGGING,COMPUTE_VOLUME_EXTEND,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_STORAGE_BUS_USB,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_VIRTIO,COMPUTE_SECURITY_TPM_1_2,HW_CPU_X86_SSSE3,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_VIRTIO,HW_CPU_X86_SSE41,COMPUTE_ACCELERATORS,HW_CPU_X86_MMX,COMPUTE_IMAGE_TYPE_ARI,HW_CPU_X86_SSE42,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_SECURITY_UEFI_SECURE_BOOT,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_SECURITY_TPM_2_0 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.705 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.721 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.723 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:30:36 np0005474864 nova_compute[192593]: 2025-10-07 20:30:36.724 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.333s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:30:37 np0005474864 podman[236189]: 2025-10-07 20:30:37.418512475 +0000 UTC m=+0.103764829 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent)
Oct  7 16:30:39 np0005474864 nova_compute[192593]: 2025-10-07 20:30:39.555 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:40 np0005474864 nova_compute[192593]: 2025-10-07 20:30:40.209 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:40 np0005474864 nova_compute[192593]: 2025-10-07 20:30:40.725 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:30:41 np0005474864 nova_compute[192593]: 2025-10-07 20:30:41.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:30:41 np0005474864 nova_compute[192593]: 2025-10-07 20:30:41.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:30:41 np0005474864 nova_compute[192593]: 2025-10-07 20:30:41.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:30:41 np0005474864 nova_compute[192593]: 2025-10-07 20:30:41.114 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:30:41 np0005474864 nova_compute[192593]: 2025-10-07 20:30:41.115 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:30:41 np0005474864 nova_compute[192593]: 2025-10-07 20:30:41.115 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:30:41 np0005474864 podman[236208]: 2025-10-07 20:30:41.390239167 +0000 UTC m=+0.075546310 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:30:42 np0005474864 nova_compute[192593]: 2025-10-07 20:30:42.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:30:44 np0005474864 nova_compute[192593]: 2025-10-07 20:30:44.558 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:45 np0005474864 nova_compute[192593]: 2025-10-07 20:30:45.211 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:46 np0005474864 podman[236232]: 2025-10-07 20:30:46.392569842 +0000 UTC m=+0.083700017 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3)
Oct  7 16:30:47 np0005474864 nova_compute[192593]: 2025-10-07 20:30:47.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:30:49 np0005474864 nova_compute[192593]: 2025-10-07 20:30:49.581 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:50 np0005474864 nova_compute[192593]: 2025-10-07 20:30:50.214 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:54 np0005474864 nova_compute[192593]: 2025-10-07 20:30:54.585 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:55 np0005474864 nova_compute[192593]: 2025-10-07 20:30:55.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:30:59 np0005474864 nova_compute[192593]: 2025-10-07 20:30:59.610 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:31:00 np0005474864 nova_compute[192593]: 2025-10-07 20:31:00.250 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:31:00 np0005474864 podman[236254]: 2025-10-07 20:31:00.409684714 +0000 UTC m=+0.091167704 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, config_id=edpm, version=9.6, container_name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Oct  7 16:31:00 np0005474864 podman[236253]: 2025-10-07 20:31:00.434892624 +0000 UTC m=+0.118082593 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:31:04 np0005474864 podman[236295]: 2025-10-07 20:31:04.422465666 +0000 UTC m=+0.101804652 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=iscsid)
Oct  7 16:31:04 np0005474864 podman[236297]: 2025-10-07 20:31:04.431443336 +0000 UTC m=+0.100771572 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  7 16:31:04 np0005474864 podman[236296]: 2025-10-07 20:31:04.479651203 +0000 UTC m=+0.152007287 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:31:04 np0005474864 nova_compute[192593]: 2025-10-07 20:31:04.611 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:31:05 np0005474864 nova_compute[192593]: 2025-10-07 20:31:05.252 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:31:08 np0005474864 podman[236358]: 2025-10-07 20:31:08.378458552 +0000 UTC m=+0.076008823 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0)
Oct  7 16:31:09 np0005474864 nova_compute[192593]: 2025-10-07 20:31:09.646 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:31:10 np0005474864 nova_compute[192593]: 2025-10-07 20:31:10.254 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:31:12 np0005474864 podman[236378]: 2025-10-07 20:31:12.382323126 +0000 UTC m=+0.072578205 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Oct  7 16:31:14 np0005474864 nova_compute[192593]: 2025-10-07 20:31:14.650 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:31:15 np0005474864 nova_compute[192593]: 2025-10-07 20:31:15.256 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:31:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:31:16.208 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:31:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:31:16.209 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:31:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:31:16.209 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:31:17 np0005474864 podman[236403]: 2025-10-07 20:31:17.404127764 +0000 UTC m=+0.093705807 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  7 16:31:19 np0005474864 nova_compute[192593]: 2025-10-07 20:31:19.681 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:31:20 np0005474864 nova_compute[192593]: 2025-10-07 20:31:20.257 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:31:24 np0005474864 nova_compute[192593]: 2025-10-07 20:31:24.719 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:31:25 np0005474864 nova_compute[192593]: 2025-10-07 20:31:25.259 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:31:29 np0005474864 nova_compute[192593]: 2025-10-07 20:31:29.762 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:31:30 np0005474864 nova_compute[192593]: 2025-10-07 20:31:30.262 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:31:31 np0005474864 podman[236424]: 2025-10-07 20:31:31.416085987 +0000 UTC m=+0.102151972 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, vcs-type=git, vendor=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Oct  7 16:31:31 np0005474864 podman[236423]: 2025-10-07 20:31:31.438754094 +0000 UTC m=+0.127121775 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  7 16:31:34 np0005474864 nova_compute[192593]: 2025-10-07 20:31:34.781 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:31:35 np0005474864 nova_compute[192593]: 2025-10-07 20:31:35.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:31:35 np0005474864 nova_compute[192593]: 2025-10-07 20:31:35.264 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:31:35 np0005474864 podman[236468]: 2025-10-07 20:31:35.384637617 +0000 UTC m=+0.081653888 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_id=iscsid)
Oct  7 16:31:35 np0005474864 podman[236470]: 2025-10-07 20:31:35.394053 +0000 UTC m=+0.081891905 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.license=GPLv2, tcib_managed=true)
Oct  7 16:31:35 np0005474864 podman[236469]: 2025-10-07 20:31:35.39784104 +0000 UTC m=+0.094880281 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  7 16:31:36 np0005474864 nova_compute[192593]: 2025-10-07 20:31:36.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:31:38 np0005474864 nova_compute[192593]: 2025-10-07 20:31:38.088 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:31:38 np0005474864 nova_compute[192593]: 2025-10-07 20:31:38.091 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:31:38 np0005474864 nova_compute[192593]: 2025-10-07 20:31:38.121 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:31:38 np0005474864 nova_compute[192593]: 2025-10-07 20:31:38.122 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:31:38 np0005474864 nova_compute[192593]: 2025-10-07 20:31:38.123 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:31:38 np0005474864 nova_compute[192593]: 2025-10-07 20:31:38.123 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Oct  7 16:31:38 np0005474864 nova_compute[192593]: 2025-10-07 20:31:38.351 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Oct  7 16:31:38 np0005474864 nova_compute[192593]: 2025-10-07 20:31:38.353 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5717MB free_disk=73.4545783996582GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Oct  7 16:31:38 np0005474864 nova_compute[192593]: 2025-10-07 20:31:38.353 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Oct  7 16:31:38 np0005474864 nova_compute[192593]: 2025-10-07 20:31:38.353 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Oct  7 16:31:38 np0005474864 nova_compute[192593]: 2025-10-07 20:31:38.420 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Oct  7 16:31:38 np0005474864 nova_compute[192593]: 2025-10-07 20:31:38.421 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Oct  7 16:31:38 np0005474864 nova_compute[192593]: 2025-10-07 20:31:38.459 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Oct  7 16:31:38 np0005474864 nova_compute[192593]: 2025-10-07 20:31:38.478 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Oct  7 16:31:38 np0005474864 nova_compute[192593]: 2025-10-07 20:31:38.481 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Oct  7 16:31:38 np0005474864 nova_compute[192593]: 2025-10-07 20:31:38.481 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Oct  7 16:31:39 np0005474864 podman[236533]: 2025-10-07 20:31:39.385340899 +0000 UTC m=+0.072405559 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:31:39 np0005474864 nova_compute[192593]: 2025-10-07 20:31:39.841 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:31:40 np0005474864 nova_compute[192593]: 2025-10-07 20:31:40.266 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:31:40 np0005474864 nova_compute[192593]: 2025-10-07 20:31:40.482 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:31:42 np0005474864 nova_compute[192593]: 2025-10-07 20:31:42.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:31:42 np0005474864 nova_compute[192593]: 2025-10-07 20:31:42.092 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Oct  7 16:31:42 np0005474864 nova_compute[192593]: 2025-10-07 20:31:42.092 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Oct  7 16:31:42 np0005474864 nova_compute[192593]: 2025-10-07 20:31:42.129 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Oct  7 16:31:42 np0005474864 nova_compute[192593]: 2025-10-07 20:31:42.130 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:31:42 np0005474864 nova_compute[192593]: 2025-10-07 20:31:42.130 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:31:42 np0005474864 nova_compute[192593]: 2025-10-07 20:31:42.131 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Oct  7 16:31:43 np0005474864 nova_compute[192593]: 2025-10-07 20:31:43.126 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:31:43 np0005474864 podman[236552]: 2025-10-07 20:31:43.391375755 +0000 UTC m=+0.077886578 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:31:44 np0005474864 nova_compute[192593]: 2025-10-07 20:31:44.896 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:31:45 np0005474864 nova_compute[192593]: 2025-10-07 20:31:45.268 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Oct  7 16:31:48 np0005474864 nova_compute[192593]: 2025-10-07 20:31:48.091 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Oct  7 16:31:48 np0005474864 podman[236576]: 2025-10-07 20:31:48.390496236 +0000 UTC m=+0.077322552 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Oct  7 16:31:49 np0005474864 nova_compute[192593]: 2025-10-07 20:31:49.902 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:31:50 np0005474864 nova_compute[192593]: 2025-10-07 20:31:50.271 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:31:54 np0005474864 nova_compute[192593]: 2025-10-07 20:31:54.934 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:31:55 np0005474864 nova_compute[192593]: 2025-10-07 20:31:55.273 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:31:59 np0005474864 nova_compute[192593]: 2025-10-07 20:31:59.941 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:00 np0005474864 nova_compute[192593]: 2025-10-07 20:32:00.277 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:02 np0005474864 podman[236599]: 2025-10-07 20:32:02.389144615 +0000 UTC m=+0.073588862 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  7 16:32:02 np0005474864 podman[236600]: 2025-10-07 20:32:02.410598577 +0000 UTC m=+0.091030768 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, build-date=2025-08-20T13:12:41, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, version=9.6, io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, architecture=x86_64)
Oct  7 16:32:04 np0005474864 nova_compute[192593]: 2025-10-07 20:32:04.946 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:05 np0005474864 nova_compute[192593]: 2025-10-07 20:32:05.278 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:06 np0005474864 podman[236644]: 2025-10-07 20:32:06.402340767 +0000 UTC m=+0.084900092 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=iscsid, managed_by=edpm_ansible, org.label-schema.build-date=20251001, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Oct  7 16:32:06 np0005474864 podman[236646]: 2025-10-07 20:32:06.445705044 +0000 UTC m=+0.111498563 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.build-date=20251001)
Oct  7 16:32:06 np0005474864 podman[236645]: 2025-10-07 20:32:06.473556341 +0000 UTC m=+0.145024075 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:32:09 np0005474864 nova_compute[192593]: 2025-10-07 20:32:09.949 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:10 np0005474864 nova_compute[192593]: 2025-10-07 20:32:10.281 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:10 np0005474864 podman[236709]: 2025-10-07 20:32:10.387979942 +0000 UTC m=+0.078156276 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Oct  7 16:32:14 np0005474864 podman[236729]: 2025-10-07 20:32:14.396063828 +0000 UTC m=+0.084382427 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:32:14 np0005474864 nova_compute[192593]: 2025-10-07 20:32:14.953 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:15 np0005474864 nova_compute[192593]: 2025-10-07 20:32:15.283 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:15 np0005474864 nova_compute[192593]: 2025-10-07 20:32:15.575 2 DEBUG oslo_concurrency.processutils [None req-34a42e8c-987e-4a15-a414-27db18c65261 137e1e151f7a4796bc1733e1911b8acf 0382839ef71e4276aa5e57e3b819687c - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Oct  7 16:32:15 np0005474864 nova_compute[192593]: 2025-10-07 20:32:15.618 2 DEBUG oslo_concurrency.processutils [None req-34a42e8c-987e-4a15-a414-27db18c65261 137e1e151f7a4796bc1733e1911b8acf 0382839ef71e4276aa5e57e3b819687c - - default default] CMD "env LANG=C uptime" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Oct  7 16:32:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:32:16.209 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:32:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:32:16.210 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:32:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:32:16.210 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:32:19 np0005474864 podman[236754]: 2025-10-07 20:32:19.405547359 +0000 UTC m=+0.090534007 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, container_name=ceilometer_agent_compute)
Oct  7 16:32:19 np0005474864 nova_compute[192593]: 2025-10-07 20:32:19.958 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:20 np0005474864 nova_compute[192593]: 2025-10-07 20:32:20.286 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:32:21.658 103685 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'ce:76:a3', 'max_tunid': '16711680', 'northd_internal_version': '24.03.7-20.33.0-76.8', 'svc_monitor_mac': 'ba:bb:91:e8:2b:5d'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Oct  7 16:32:21 np0005474864 nova_compute[192593]: 2025-10-07 20:32:21.659 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:21 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:32:21.660 103685 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Oct  7 16:32:25 np0005474864 nova_compute[192593]: 2025-10-07 20:32:25.000 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:25 np0005474864 nova_compute[192593]: 2025-10-07 20:32:25.290 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:25 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:32:25.662 103685 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=2d917af9-e2c2-4b32-93ba-e5708271f327, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Oct  7 16:32:30 np0005474864 nova_compute[192593]: 2025-10-07 20:32:30.004 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:30 np0005474864 nova_compute[192593]: 2025-10-07 20:32:30.291 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.262 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.262 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.262 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.262 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.262 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.263 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.263 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.263 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.263 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.263 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.263 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.264 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.264 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.264 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.264 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.264 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.265 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.265 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.265 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.265 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.265 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:32:31.266 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:32:33 np0005474864 podman[236776]: 2025-10-07 20:32:33.393422378 +0000 UTC m=+0.083030871 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Oct  7 16:32:33 np0005474864 podman[236777]: 2025-10-07 20:32:33.411162858 +0000 UTC m=+0.094305735 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, io.openshift.expose-services=, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_id=edpm, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, distribution-scope=public, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git)
Oct  7 16:32:35 np0005474864 nova_compute[192593]: 2025-10-07 20:32:35.008 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:35 np0005474864 nova_compute[192593]: 2025-10-07 20:32:35.293 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:36 np0005474864 nova_compute[192593]: 2025-10-07 20:32:36.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:32:36 np0005474864 nova_compute[192593]: 2025-10-07 20:32:36.094 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:32:37 np0005474864 podman[236819]: 2025-10-07 20:32:37.398325487 +0000 UTC m=+0.079646163 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=multipathd)
Oct  7 16:32:37 np0005474864 podman[236817]: 2025-10-07 20:32:37.401022245 +0000 UTC m=+0.085552454 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, container_name=iscsid, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Oct  7 16:32:37 np0005474864 podman[236818]: 2025-10-07 20:32:37.405993508 +0000 UTC m=+0.094879002 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3)
Oct  7 16:32:38 np0005474864 nova_compute[192593]: 2025-10-07 20:32:38.088 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:32:38 np0005474864 nova_compute[192593]: 2025-10-07 20:32:38.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:32:38 np0005474864 nova_compute[192593]: 2025-10-07 20:32:38.146 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:32:38 np0005474864 nova_compute[192593]: 2025-10-07 20:32:38.146 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:32:38 np0005474864 nova_compute[192593]: 2025-10-07 20:32:38.147 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:32:38 np0005474864 nova_compute[192593]: 2025-10-07 20:32:38.147 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:32:38 np0005474864 nova_compute[192593]: 2025-10-07 20:32:38.341 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:32:38 np0005474864 nova_compute[192593]: 2025-10-07 20:32:38.342 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5722MB free_disk=73.45455932617188GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:32:38 np0005474864 nova_compute[192593]: 2025-10-07 20:32:38.343 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:32:38 np0005474864 nova_compute[192593]: 2025-10-07 20:32:38.343 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:32:38 np0005474864 nova_compute[192593]: 2025-10-07 20:32:38.429 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:32:38 np0005474864 nova_compute[192593]: 2025-10-07 20:32:38.430 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:32:38 np0005474864 nova_compute[192593]: 2025-10-07 20:32:38.462 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:32:38 np0005474864 nova_compute[192593]: 2025-10-07 20:32:38.479 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:32:38 np0005474864 nova_compute[192593]: 2025-10-07 20:32:38.481 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:32:38 np0005474864 nova_compute[192593]: 2025-10-07 20:32:38.482 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:32:40 np0005474864 nova_compute[192593]: 2025-10-07 20:32:40.011 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:40 np0005474864 nova_compute[192593]: 2025-10-07 20:32:40.295 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:40 np0005474864 nova_compute[192593]: 2025-10-07 20:32:40.483 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:32:41 np0005474864 podman[236881]: 2025-10-07 20:32:41.388709228 +0000 UTC m=+0.082265959 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true)
Oct  7 16:32:42 np0005474864 nova_compute[192593]: 2025-10-07 20:32:42.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:32:42 np0005474864 nova_compute[192593]: 2025-10-07 20:32:42.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:32:43 np0005474864 nova_compute[192593]: 2025-10-07 20:32:43.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:32:43 np0005474864 nova_compute[192593]: 2025-10-07 20:32:43.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:32:43 np0005474864 nova_compute[192593]: 2025-10-07 20:32:43.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:32:43 np0005474864 nova_compute[192593]: 2025-10-07 20:32:43.111 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:32:44 np0005474864 nova_compute[192593]: 2025-10-07 20:32:44.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:32:45 np0005474864 nova_compute[192593]: 2025-10-07 20:32:45.017 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:45 np0005474864 nova_compute[192593]: 2025-10-07 20:32:45.297 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:45 np0005474864 podman[236901]: 2025-10-07 20:32:45.429740077 +0000 UTC m=+0.114291090 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:32:49 np0005474864 nova_compute[192593]: 2025-10-07 20:32:49.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:32:50 np0005474864 nova_compute[192593]: 2025-10-07 20:32:50.021 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:50 np0005474864 nova_compute[192593]: 2025-10-07 20:32:50.299 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:50 np0005474864 podman[236925]: 2025-10-07 20:32:50.376816134 +0000 UTC m=+0.077068329 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, io.buildah.version=1.41.3)
Oct  7 16:32:55 np0005474864 nova_compute[192593]: 2025-10-07 20:32:55.048 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:32:55 np0005474864 nova_compute[192593]: 2025-10-07 20:32:55.300 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:00 np0005474864 nova_compute[192593]: 2025-10-07 20:33:00.050 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:00 np0005474864 nova_compute[192593]: 2025-10-07 20:33:00.302 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:04 np0005474864 podman[236945]: 2025-10-07 20:33:04.359877414 +0000 UTC m=+0.064023494 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  7 16:33:04 np0005474864 podman[236946]: 2025-10-07 20:33:04.374700891 +0000 UTC m=+0.074542997 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, version=9.6, maintainer=Red Hat, Inc., name=ubi9-minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-type=git, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Oct  7 16:33:05 np0005474864 nova_compute[192593]: 2025-10-07 20:33:05.053 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:05 np0005474864 nova_compute[192593]: 2025-10-07 20:33:05.304 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:08 np0005474864 podman[236991]: 2025-10-07 20:33:08.40464172 +0000 UTC m=+0.091677309 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=iscsid, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Oct  7 16:33:08 np0005474864 podman[236993]: 2025-10-07 20:33:08.415637897 +0000 UTC m=+0.091162285 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, container_name=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  7 16:33:08 np0005474864 podman[236992]: 2025-10-07 20:33:08.491515111 +0000 UTC m=+0.173593928 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001)
Oct  7 16:33:10 np0005474864 nova_compute[192593]: 2025-10-07 20:33:10.054 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:10 np0005474864 nova_compute[192593]: 2025-10-07 20:33:10.306 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:12 np0005474864 podman[237054]: 2025-10-07 20:33:12.393688914 +0000 UTC m=+0.083321740 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  7 16:33:15 np0005474864 nova_compute[192593]: 2025-10-07 20:33:15.057 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:15 np0005474864 nova_compute[192593]: 2025-10-07 20:33:15.307 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:16 np0005474864 nova_compute[192593]: 2025-10-07 20:33:16.052 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:33:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:33:16.210 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:33:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:33:16.211 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:33:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:33:16.212 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:33:16 np0005474864 podman[237074]: 2025-10-07 20:33:16.396633717 +0000 UTC m=+0.080093537 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  7 16:33:20 np0005474864 nova_compute[192593]: 2025-10-07 20:33:20.061 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:20 np0005474864 nova_compute[192593]: 2025-10-07 20:33:20.310 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:21 np0005474864 podman[237098]: 2025-10-07 20:33:21.400502528 +0000 UTC m=+0.093182383 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Oct  7 16:33:25 np0005474864 nova_compute[192593]: 2025-10-07 20:33:25.086 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:25 np0005474864 nova_compute[192593]: 2025-10-07 20:33:25.311 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:30 np0005474864 nova_compute[192593]: 2025-10-07 20:33:30.091 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:30 np0005474864 nova_compute[192593]: 2025-10-07 20:33:30.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:32 np0005474864 nova_compute[192593]: 2025-10-07 20:33:32.021 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:33:35 np0005474864 nova_compute[192593]: 2025-10-07 20:33:35.095 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:35 np0005474864 nova_compute[192593]: 2025-10-07 20:33:35.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:35 np0005474864 podman[237120]: 2025-10-07 20:33:35.380567853 +0000 UTC m=+0.070665445 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, config_id=edpm)
Oct  7 16:33:35 np0005474864 podman[237119]: 2025-10-07 20:33:35.395101612 +0000 UTC m=+0.080928811 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:33:37 np0005474864 nova_compute[192593]: 2025-10-07 20:33:37.128 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:33:38 np0005474864 nova_compute[192593]: 2025-10-07 20:33:38.094 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:33:38 np0005474864 nova_compute[192593]: 2025-10-07 20:33:38.095 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:33:38 np0005474864 nova_compute[192593]: 2025-10-07 20:33:38.133 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:33:38 np0005474864 nova_compute[192593]: 2025-10-07 20:33:38.134 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:33:38 np0005474864 nova_compute[192593]: 2025-10-07 20:33:38.134 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:33:38 np0005474864 nova_compute[192593]: 2025-10-07 20:33:38.134 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:33:38 np0005474864 nova_compute[192593]: 2025-10-07 20:33:38.381 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:33:38 np0005474864 nova_compute[192593]: 2025-10-07 20:33:38.383 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5728MB free_disk=73.4545783996582GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:33:38 np0005474864 nova_compute[192593]: 2025-10-07 20:33:38.383 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:33:38 np0005474864 nova_compute[192593]: 2025-10-07 20:33:38.384 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:33:38 np0005474864 nova_compute[192593]: 2025-10-07 20:33:38.586 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:33:38 np0005474864 nova_compute[192593]: 2025-10-07 20:33:38.587 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:33:38 np0005474864 nova_compute[192593]: 2025-10-07 20:33:38.618 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:33:38 np0005474864 nova_compute[192593]: 2025-10-07 20:33:38.635 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:33:38 np0005474864 nova_compute[192593]: 2025-10-07 20:33:38.637 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:33:38 np0005474864 nova_compute[192593]: 2025-10-07 20:33:38.638 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.254s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:33:39 np0005474864 podman[237165]: 2025-10-07 20:33:39.407778253 +0000 UTC m=+0.093632666 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_id=iscsid, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=iscsid)
Oct  7 16:33:39 np0005474864 podman[237167]: 2025-10-07 20:33:39.424827394 +0000 UTC m=+0.102636375 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd)
Oct  7 16:33:39 np0005474864 podman[237166]: 2025-10-07 20:33:39.463379284 +0000 UTC m=+0.143844811 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Oct  7 16:33:39 np0005474864 nova_compute[192593]: 2025-10-07 20:33:39.632 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:33:40 np0005474864 nova_compute[192593]: 2025-10-07 20:33:40.098 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:40 np0005474864 nova_compute[192593]: 2025-10-07 20:33:40.316 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:41 np0005474864 nova_compute[192593]: 2025-10-07 20:33:41.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:33:43 np0005474864 podman[237223]: 2025-10-07 20:33:43.387015834 +0000 UTC m=+0.081361083 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:33:44 np0005474864 nova_compute[192593]: 2025-10-07 20:33:44.087 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:33:44 np0005474864 nova_compute[192593]: 2025-10-07 20:33:44.105 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:33:44 np0005474864 nova_compute[192593]: 2025-10-07 20:33:44.105 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:33:44 np0005474864 nova_compute[192593]: 2025-10-07 20:33:44.106 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:33:45 np0005474864 nova_compute[192593]: 2025-10-07 20:33:45.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:33:45 np0005474864 nova_compute[192593]: 2025-10-07 20:33:45.094 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:33:45 np0005474864 nova_compute[192593]: 2025-10-07 20:33:45.094 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:33:45 np0005474864 nova_compute[192593]: 2025-10-07 20:33:45.100 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:45 np0005474864 nova_compute[192593]: 2025-10-07 20:33:45.108 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:33:45 np0005474864 nova_compute[192593]: 2025-10-07 20:33:45.320 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:47 np0005474864 nova_compute[192593]: 2025-10-07 20:33:47.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:33:47 np0005474864 nova_compute[192593]: 2025-10-07 20:33:47.094 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Oct  7 16:33:47 np0005474864 podman[237242]: 2025-10-07 20:33:47.399235594 +0000 UTC m=+0.084243355 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:33:49 np0005474864 nova_compute[192593]: 2025-10-07 20:33:49.113 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:33:50 np0005474864 nova_compute[192593]: 2025-10-07 20:33:50.155 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:50 np0005474864 nova_compute[192593]: 2025-10-07 20:33:50.323 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:52 np0005474864 podman[237267]: 2025-10-07 20:33:52.409330645 +0000 UTC m=+0.103634464 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:33:55 np0005474864 nova_compute[192593]: 2025-10-07 20:33:55.193 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:33:55 np0005474864 nova_compute[192593]: 2025-10-07 20:33:55.325 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:00 np0005474864 nova_compute[192593]: 2025-10-07 20:34:00.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:34:00 np0005474864 nova_compute[192593]: 2025-10-07 20:34:00.093 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Oct  7 16:34:00 np0005474864 nova_compute[192593]: 2025-10-07 20:34:00.118 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Oct  7 16:34:00 np0005474864 nova_compute[192593]: 2025-10-07 20:34:00.236 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:00 np0005474864 nova_compute[192593]: 2025-10-07 20:34:00.327 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:01 np0005474864 nova_compute[192593]: 2025-10-07 20:34:01.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:34:05 np0005474864 nova_compute[192593]: 2025-10-07 20:34:05.240 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:05 np0005474864 nova_compute[192593]: 2025-10-07 20:34:05.329 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:06 np0005474864 podman[237287]: 2025-10-07 20:34:06.398647798 +0000 UTC m=+0.083360981 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:34:06 np0005474864 podman[237288]: 2025-10-07 20:34:06.428825866 +0000 UTC m=+0.106021652 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, name=ubi9-minimal, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, config_id=edpm, vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Oct  7 16:34:10 np0005474864 nova_compute[192593]: 2025-10-07 20:34:10.243 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:10 np0005474864 nova_compute[192593]: 2025-10-07 20:34:10.334 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:10 np0005474864 podman[237333]: 2025-10-07 20:34:10.407801889 +0000 UTC m=+0.094667825 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_id=iscsid, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  7 16:34:10 np0005474864 podman[237335]: 2025-10-07 20:34:10.421672199 +0000 UTC m=+0.098998711 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Oct  7 16:34:10 np0005474864 podman[237334]: 2025-10-07 20:34:10.488233985 +0000 UTC m=+0.169722537 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Oct  7 16:34:14 np0005474864 podman[237394]: 2025-10-07 20:34:14.393182827 +0000 UTC m=+0.088403016 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251001, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:34:15 np0005474864 nova_compute[192593]: 2025-10-07 20:34:15.247 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:15 np0005474864 nova_compute[192593]: 2025-10-07 20:34:15.335 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:34:16.212 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:34:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:34:16.212 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:34:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:34:16.213 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:34:18 np0005474864 podman[237414]: 2025-10-07 20:34:18.409720351 +0000 UTC m=+0.102522502 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  7 16:34:20 np0005474864 nova_compute[192593]: 2025-10-07 20:34:20.285 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:20 np0005474864 nova_compute[192593]: 2025-10-07 20:34:20.337 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:23 np0005474864 podman[237437]: 2025-10-07 20:34:23.393606066 +0000 UTC m=+0.084000968 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251001, org.label-schema.vendor=CentOS)
Oct  7 16:34:25 np0005474864 nova_compute[192593]: 2025-10-07 20:34:25.314 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:25 np0005474864 nova_compute[192593]: 2025-10-07 20:34:25.339 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:30 np0005474864 nova_compute[192593]: 2025-10-07 20:34:30.367 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.261 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.262 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.262 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.262 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.262 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.262 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.262 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.263 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.263 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.263 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.263 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.263 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.263 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.263 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.264 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.264 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.264 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.264 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.264 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.264 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:31 np0005474864 ceilometer_agent_compute[203419]: 2025-10-07 20:34:31.264 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Oct  7 16:34:35 np0005474864 nova_compute[192593]: 2025-10-07 20:34:35.368 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:34:35 np0005474864 nova_compute[192593]: 2025-10-07 20:34:35.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:35 np0005474864 nova_compute[192593]: 2025-10-07 20:34:35.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  7 16:34:35 np0005474864 nova_compute[192593]: 2025-10-07 20:34:35.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:34:35 np0005474864 nova_compute[192593]: 2025-10-07 20:34:35.370 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:34:35 np0005474864 nova_compute[192593]: 2025-10-07 20:34:35.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:37 np0005474864 nova_compute[192593]: 2025-10-07 20:34:37.110 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:34:37 np0005474864 podman[237457]: 2025-10-07 20:34:37.405795694 +0000 UTC m=+0.088113417 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Oct  7 16:34:37 np0005474864 podman[237458]: 2025-10-07 20:34:37.422231857 +0000 UTC m=+0.099922757 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, architecture=x86_64, distribution-scope=public, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., name=ubi9-minimal, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers)
Oct  7 16:34:38 np0005474864 nova_compute[192593]: 2025-10-07 20:34:38.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:34:38 np0005474864 nova_compute[192593]: 2025-10-07 20:34:38.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:34:38 np0005474864 nova_compute[192593]: 2025-10-07 20:34:38.136 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:34:38 np0005474864 nova_compute[192593]: 2025-10-07 20:34:38.137 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:34:38 np0005474864 nova_compute[192593]: 2025-10-07 20:34:38.138 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:34:38 np0005474864 nova_compute[192593]: 2025-10-07 20:34:38.138 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Oct  7 16:34:38 np0005474864 nova_compute[192593]: 2025-10-07 20:34:38.379 2 WARNING nova.virt.libvirt.driver [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Oct  7 16:34:38 np0005474864 nova_compute[192593]: 2025-10-07 20:34:38.381 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5715MB free_disk=73.4545783996582GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Oct  7 16:34:38 np0005474864 nova_compute[192593]: 2025-10-07 20:34:38.381 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:34:38 np0005474864 nova_compute[192593]: 2025-10-07 20:34:38.382 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:34:38 np0005474864 nova_compute[192593]: 2025-10-07 20:34:38.481 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Oct  7 16:34:38 np0005474864 nova_compute[192593]: 2025-10-07 20:34:38.482 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Oct  7 16:34:38 np0005474864 nova_compute[192593]: 2025-10-07 20:34:38.520 2 DEBUG nova.compute.provider_tree [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed in ProviderTree for provider: 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Oct  7 16:34:38 np0005474864 nova_compute[192593]: 2025-10-07 20:34:38.542 2 DEBUG nova.scheduler.client.report [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Inventory has not changed for provider 63545c2e-7bb7-4b7a-9af2-ee768bda9cb4 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Oct  7 16:34:38 np0005474864 nova_compute[192593]: 2025-10-07 20:34:38.544 2 DEBUG nova.compute.resource_tracker [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Oct  7 16:34:38 np0005474864 nova_compute[192593]: 2025-10-07 20:34:38.544 2 DEBUG oslo_concurrency.lockutils [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.163s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:34:40 np0005474864 nova_compute[192593]: 2025-10-07 20:34:40.371 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:41 np0005474864 podman[237506]: 2025-10-07 20:34:41.40846363 +0000 UTC m=+0.096164429 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251001, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, container_name=iscsid, managed_by=edpm_ansible, tcib_managed=true, config_id=iscsid, org.label-schema.license=GPLv2)
Oct  7 16:34:41 np0005474864 podman[237508]: 2025-10-07 20:34:41.419805986 +0000 UTC m=+0.093975106 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Oct  7 16:34:41 np0005474864 podman[237507]: 2025-10-07 20:34:41.487637618 +0000 UTC m=+0.166516763 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297)
Oct  7 16:34:41 np0005474864 nova_compute[192593]: 2025-10-07 20:34:41.540 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:34:42 np0005474864 nova_compute[192593]: 2025-10-07 20:34:42.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:34:44 np0005474864 nova_compute[192593]: 2025-10-07 20:34:44.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:34:44 np0005474864 nova_compute[192593]: 2025-10-07 20:34:44.092 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Oct  7 16:34:45 np0005474864 nova_compute[192593]: 2025-10-07 20:34:45.093 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:34:45 np0005474864 nova_compute[192593]: 2025-10-07 20:34:45.094 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Oct  7 16:34:45 np0005474864 nova_compute[192593]: 2025-10-07 20:34:45.094 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Oct  7 16:34:45 np0005474864 nova_compute[192593]: 2025-10-07 20:34:45.111 2 DEBUG nova.compute.manager [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Oct  7 16:34:45 np0005474864 nova_compute[192593]: 2025-10-07 20:34:45.112 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:34:45 np0005474864 nova_compute[192593]: 2025-10-07 20:34:45.373 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:34:45 np0005474864 nova_compute[192593]: 2025-10-07 20:34:45.374 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:45 np0005474864 nova_compute[192593]: 2025-10-07 20:34:45.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  7 16:34:45 np0005474864 nova_compute[192593]: 2025-10-07 20:34:45.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:34:45 np0005474864 nova_compute[192593]: 2025-10-07 20:34:45.375 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:34:45 np0005474864 nova_compute[192593]: 2025-10-07 20:34:45.376 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:45 np0005474864 podman[237569]: 2025-10-07 20:34:45.431694746 +0000 UTC m=+0.120456128 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251001, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Oct  7 16:34:49 np0005474864 nova_compute[192593]: 2025-10-07 20:34:49.092 2 DEBUG oslo_service.periodic_task [None req-65394376-6377-42a3-b227-16ef9623b8d5 - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Oct  7 16:34:49 np0005474864 podman[237587]: 2025-10-07 20:34:49.396224114 +0000 UTC m=+0.089820156 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Oct  7 16:34:50 np0005474864 nova_compute[192593]: 2025-10-07 20:34:50.378 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:34:50 np0005474864 nova_compute[192593]: 2025-10-07 20:34:50.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:34:50 np0005474864 nova_compute[192593]: 2025-10-07 20:34:50.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  7 16:34:50 np0005474864 nova_compute[192593]: 2025-10-07 20:34:50.379 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:34:50 np0005474864 nova_compute[192593]: 2025-10-07 20:34:50.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:50 np0005474864 nova_compute[192593]: 2025-10-07 20:34:50.431 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:34:54 np0005474864 podman[237613]: 2025-10-07 20:34:54.380791489 +0000 UTC m=+0.076466921 container health_status b90a984278f1db3d3c53333fb4339512e6884c4879bb56ebdb3896abd756917a (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251001, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3)
Oct  7 16:34:55 np0005474864 nova_compute[192593]: 2025-10-07 20:34:55.432 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:34:55 np0005474864 nova_compute[192593]: 2025-10-07 20:34:55.433 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:34:55 np0005474864 nova_compute[192593]: 2025-10-07 20:34:55.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  7 16:34:55 np0005474864 nova_compute[192593]: 2025-10-07 20:34:55.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:34:55 np0005474864 nova_compute[192593]: 2025-10-07 20:34:55.434 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:34:55 np0005474864 nova_compute[192593]: 2025-10-07 20:34:55.435 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:35:00 np0005474864 nova_compute[192593]: 2025-10-07 20:35:00.436 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:35:00 np0005474864 nova_compute[192593]: 2025-10-07 20:35:00.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:35:00 np0005474864 nova_compute[192593]: 2025-10-07 20:35:00.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  7 16:35:00 np0005474864 nova_compute[192593]: 2025-10-07 20:35:00.438 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:35:00 np0005474864 nova_compute[192593]: 2025-10-07 20:35:00.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:35:00 np0005474864 nova_compute[192593]: 2025-10-07 20:35:00.483 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:35:05 np0005474864 nova_compute[192593]: 2025-10-07 20:35:05.484 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:35:05 np0005474864 nova_compute[192593]: 2025-10-07 20:35:05.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:35:05 np0005474864 nova_compute[192593]: 2025-10-07 20:35:05.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5003 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  7 16:35:05 np0005474864 nova_compute[192593]: 2025-10-07 20:35:05.486 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:35:05 np0005474864 nova_compute[192593]: 2025-10-07 20:35:05.528 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:35:05 np0005474864 nova_compute[192593]: 2025-10-07 20:35:05.529 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:35:08 np0005474864 podman[237634]: 2025-10-07 20:35:08.400595009 +0000 UTC m=+0.083218987 container health_status 52d492d05f0e57d349db56b24685bf4a9eee0bc85ec5f441ffce136a1ff06815 (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Oct  7 16:35:08 np0005474864 podman[237635]: 2025-10-07 20:35:08.439647333 +0000 UTC m=+0.117724860 container health_status cf52d47fee9f715998d5a484278772959df6f6266462cd45b7ce3e9cbd66af85 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, release=1755695350, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.tags=minimal rhel9, distribution-scope=public, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, vcs-type=git, com.redhat.component=ubi9-minimal-container)
Oct  7 16:35:10 np0005474864 nova_compute[192593]: 2025-10-07 20:35:10.530 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:35:10 np0005474864 nova_compute[192593]: 2025-10-07 20:35:10.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:35:10 np0005474864 nova_compute[192593]: 2025-10-07 20:35:10.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5002 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  7 16:35:10 np0005474864 nova_compute[192593]: 2025-10-07 20:35:10.532 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:35:10 np0005474864 nova_compute[192593]: 2025-10-07 20:35:10.565 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:35:10 np0005474864 nova_compute[192593]: 2025-10-07 20:35:10.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:35:11 np0005474864 systemd-logind[805]: New session 33 of user zuul.
Oct  7 16:35:11 np0005474864 systemd[1]: Started Session 33 of User zuul.
Oct  7 16:35:11 np0005474864 podman[237681]: 2025-10-07 20:35:11.874587956 +0000 UTC m=+0.109616026 container health_status 6d9b72e66b5d71a9ccad8f23eb9a089cc5c077cd651d5ae6785fb288edfdfb1b (image=quay.io/podified-antelope-centos9/openstack-iscsid:current-podified, name=iscsid, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/iscsid', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-iscsid:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/iscsid.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run:/run', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:z', '/etc/target:/etc/target:z', '/var/lib/iscsi:/var/lib/iscsi:z', '/var/lib/openstack/healthchecks/iscsid:/openstack:ro,z']}, config_id=iscsid, container_name=iscsid, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Oct  7 16:35:11 np0005474864 podman[237684]: 2025-10-07 20:35:11.874914495 +0000 UTC m=+0.100004389 container health_status ef0c5267b39fbb4d98422dcefa4fb24f0c2adec3d32717bf0d4747fa44ab9434 (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi:z', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, io.buildah.version=1.41.3, org.label-schema.build-date=20251001)
Oct  7 16:35:11 np0005474864 podman[237683]: 2025-10-07 20:35:11.92022653 +0000 UTC m=+0.151873223 container health_status a4ccd63c0e06efc4c04b8fe39db5aa690bed7f7e5dc20b758781ca815c2c9a7a (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_controller, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, org.label-schema.license=GPLv2)
Oct  7 16:35:15 np0005474864 nova_compute[192593]: 2025-10-07 20:35:15.566 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4998-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Oct  7 16:35:15 np0005474864 nova_compute[192593]: 2025-10-07 20:35:15.567 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:35:15 np0005474864 nova_compute[192593]: 2025-10-07 20:35:15.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5001 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Oct  7 16:35:15 np0005474864 nova_compute[192593]: 2025-10-07 20:35:15.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:35:15 np0005474864 nova_compute[192593]: 2025-10-07 20:35:15.568 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Oct  7 16:35:15 np0005474864 nova_compute[192593]: 2025-10-07 20:35:15.570 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:35:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:35:16.213 103685 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Oct  7 16:35:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:35:16.214 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Oct  7 16:35:16 np0005474864 ovn_metadata_agent[103680]: 2025-10-07 20:35:16.214 103685 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Oct  7 16:35:16 np0005474864 podman[237898]: 2025-10-07 20:35:16.432069869 +0000 UTC m=+0.117586545 container health_status a223c096cb2ff2ba6f3bb530474a0469a8361d1693f09a8a46f4eafd8789459c (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=b78cfc68a577b1553523c8a70a34e297, managed_by=edpm_ansible, org.label-schema.build-date=20251001, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Oct  7 16:35:16 np0005474864 ovs-vsctl[237937]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Oct  7 16:35:17 np0005474864 virtqemud[192092]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Oct  7 16:35:17 np0005474864 virtqemud[192092]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Oct  7 16:35:17 np0005474864 virtqemud[192092]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Oct  7 16:35:20 np0005474864 podman[238446]: 2025-10-07 20:35:20.375067937 +0000 UTC m=+0.065404654 container health_status b9df7db3baf2578e339cdd1571aec173cff6eada3b3b3a1b4c6cd7789b0e4193 (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Oct  7 16:35:20 np0005474864 nova_compute[192593]: 2025-10-07 20:35:20.569 2 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 28 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Oct  7 16:35:21 np0005474864 systemd[1]: Starting Hostname Service...
Oct  7 16:35:21 np0005474864 systemd[1]: Started Hostname Service.
